ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, Nov 14 2023, 16:14:06) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
[WARNING]: running playbook inside collection fedora.linux_system_roles
statically imported: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.
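The play names and the "Set disk interface for test" fact below show that this run comes from a generated wrapper playbook: it pins the disk interface under test and then reuses the shared tests_luks.yml body. A minimal sketch of such a wrapper, assuming the usual generated layout (only the play name, fact name, and value are taken from this log; the wrapper's file contents themselves are not shown in it):

```yaml
# Hypothetical reconstruction of tests_luks_scsi_generated.yml
- name: Run test tests_luks.yml for scsi
  hosts: all
  tasks:
    - name: Set disk interface for test
      set_fact:
        storage_test_use_interface: scsi  # value printed in the task output below

# Assumed mechanism for pulling in the shared "Test LUKS" play
- import_playbook: tests_luks.yml
```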
PLAYBOOK: tests_luks_scsi_generated.yml ****************************************
2 plays in /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks_scsi_generated.yml

PLAY [Run test tests_luks.yml for scsi] ****************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks_scsi_generated.yml:3
Saturday 31 August 2024 18:35:32 -0400 (0:00:00.027) 0:00:00.027 *******
ok: [managed_node2]
META: ran handlers

TASK [Set disk interface for test] *********************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks_scsi_generated.yml:8
Saturday 31 August 2024 18:35:33 -0400 (0:00:01.478) 0:00:01.505 *******
ok: [managed_node2] => { "ansible_facts": { "storage_test_use_interface": "scsi" }, "changed": false }
META: ran handlers
META: ran handlers

PLAY [Test LUKS] ***************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:2
Saturday 31 August 2024 18:35:33 -0400 (0:00:00.087) 0:00:01.592 *******
ok: [managed_node2]
META: ran handlers

TASK [Enable FIPS mode] ********************************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:20
Saturday 31 August 2024 18:35:34 -0400 (0:00:00.767) 0:00:02.360 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Reboot] ******************************************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:24
Saturday 31 August 2024 18:35:34 -0400 (0:00:00.048) 0:00:02.408 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Ensure dracut-fips] ******************************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:34
Saturday 31 August 2024 18:35:34 -0400 (0:00:00.059) 0:00:02.468 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Configure boot for FIPS] *************************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:40
Saturday 31 August 2024 18:35:34 -0400 (0:00:00.049) 0:00:02.518 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Reboot] ******************************************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:49
Saturday 31 August 2024 18:35:34 -0400 (0:00:00.061) 0:00:02.579 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Run the role] ************************************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:53
Saturday 31 August 2024 18:35:34 -0400 (0:00:00.073) 0:00:02.653 *******

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Saturday 31 August 2024 18:35:34 -0400 (0:00:00.071) 0:00:02.724 *******
included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed_node2

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Saturday 31 August 2024 18:35:34 -0400 (0:00:00.055) 0:00:02.779 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Saturday 31 August 2024 18:35:35 -0400 (0:00:00.057) 0:00:02.837 *******
skipping: [managed_node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed_node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed_node2] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" }
skipping: [managed_node2] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Saturday 31 August 2024 18:35:35 -0400 (0:00:00.089) 0:00:02.926 *******
ok: [managed_node2] => { "changed": false, "stat": { "exists": false } }

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Saturday 31 August 2024 18:35:35 -0400 (0:00:00.514) 0:00:03.441 *******
ok: [managed_node2] => { "ansible_facts": { "__storage_is_ostree": false }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Saturday 31 August 2024 18:35:35 -0400 (0:00:00.089) 0:00:03.530 *******
ok: [managed_node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Saturday 31 August 2024 18:35:35 -0400 (0:00:00.038) 0:00:03.569 *******
ok: [managed_node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }
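The vars loop above is a first-found platform-variables pattern: candidates run from generic (RedHat.yml) to specific (CentOS_7.9.yml), and on this CentOS 7.9 node only CentOS_7.yml matched, supplying blivet_package_list. A sketch of that pattern, assuming a conventional include_vars loop (the candidate file names come from the log; the task body and the existence test are assumptions, not the role's literal source):

```yaml
# Sketch only: load each platform/version vars file that exists,
# so more specific files can override more generic ones.
- name: Set platform/version specific variables
  include_vars: "{{ role_path }}/vars/{{ item }}"
  loop:
    - RedHat.yml       # skipped in this run
    - CentOS.yml       # skipped in this run
    - CentOS_7.yml     # matched: defines blivet_package_list
    - CentOS_7.9.yml   # skipped in this run
  when: (role_path ~ '/vars/' ~ item) is exists
```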
TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Saturday 31 August 2024 18:35:35 -0400 (0:00:00.038) 0:00:03.608 *******
included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed_node2

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
Saturday 31 August 2024 18:35:35 -0400 (0:00:00.135) 0:00:03.744 *******
ok: [managed_node2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] }
lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Saturday 31 August 2024 18:35:37 -0400 (0:00:01.601) 0:00:05.345 *******
ok: [managed_node2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" }

TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14
Saturday 31 August 2024 18:35:37 -0400 (0:00:00.048) 0:00:05.393 *******
ok: [managed_node2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
Saturday 31 August 2024 18:35:37 -0400 (0:00:00.046) 0:00:05.440 *******
ok: [managed_node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }

TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31
Saturday 31 August 2024 18:35:38 -0400 (0:00:00.708) 0:00:06.149 *******
included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed_node2

TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
Saturday 31 August 2024 18:35:38 -0400 (0:00:00.082) 0:00:06.231 *******

TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13
Saturday 31 August 2024 18:35:38 -0400 (0:00:00.020) 0:00:06.251 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19
Saturday 31 August 2024 18:35:38 -0400 (0:00:00.022) 0:00:06.274 *******

TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37
Saturday 31 August 2024 18:35:38 -0400 (0:00:00.019) 0:00:06.293 *******
ok: [managed_node2] => { "changed": false, "rc": 0, "results": [ "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] }
lsrpackages: kpartx

TASK [fedora.linux_system_roles.storage : Get service facts] *******************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51
Saturday 31 August 2024 18:35:39 -0400 (0:00:00.641) 0:00:06.935 *******
ok: [managed_node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" },
"console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": 
"getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", 
"status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": 
"rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": 
"systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", 
"state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Saturday 31 August 2024 18:35:40 -0400 (0:00:01.148) 0:00:08.083 ******* ok: [managed_node2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Saturday 31 August 2024 18:35:40 -0400 (0:00:00.113) 0:00:08.197 ******* TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Saturday 31 August 2024 18:35:40 -0400 (0:00:00.046) 0:00:08.243 ******* ok: [managed_node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Saturday 31 August 2024 18:35:40 -0400 (0:00:00.569) 0:00:08.813 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Saturday 31 August 2024 18:35:41 -0400 (0:00:00.036) 0:00:08.849 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1725143712.4371896, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "72884e3f126482c2d28276ff7c57744fa95eff91", "ctime": 1725143711.9231806, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263550, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1725143711.9231806, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1229, "uid": 0, "version": "18446744071584693682", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Saturday 31 August 2024 18:35:41 -0400 (0:00:00.377) 0:00:09.227 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Saturday 31 August 2024 18:35:41 -0400 (0:00:00.060) 0:00:09.287 ******* TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task 
TASK [fedora.linux_system_roles.storage : Show blivet_output] ******************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119
Saturday 31 August 2024 18:35:41 -0400 (0:00:00.058) 0:00:09.346 *******
ok: [managed_node2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } }

TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128
Saturday 31 August 2024 18:35:41 -0400 (0:00:00.054) 0:00:09.401 *******
ok: [managed_node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132
Saturday 31 August 2024 18:35:41 -0400 (0:00:00.053) 0:00:09.455 *******
ok: [managed_node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] **************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148
Saturday 31 August 2024 18:35:41 -0400 (0:00:00.045) 0:00:09.500 *******

TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159
Saturday 31 August 2024 18:35:41 -0400 (0:00:00.046) 0:00:09.547 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set up new/current mounts] ***********
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164
Saturday 31 August 2024 18:35:41 -0400 (0:00:00.055) 0:00:09.602 *******

TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175
Saturday 31 August 2024 18:35:41 -0400 (0:00:00.036) 0:00:09.639 *******

TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187
Saturday 31 August 2024 18:35:41 -0400 (0:00:00.045) 0:00:09.684 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195
Saturday 31 August 2024 18:35:41 -0400 (0:00:00.051) 0:00:09.735 *******
ok: [managed_node2] => { "changed": false, "stat": { "atime": 1725142921.7026803, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1718879272.062, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131079, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1718879026.308, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "18446744072852913879", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200
Saturday 31 August 2024 18:35:42 -0400 (0:00:00.454) 0:00:10.190 *******

TASK [fedora.linux_system_roles.storage : Update facts] ************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222
Saturday 31 August 2024 18:35:42 -0400 (0:00:00.043) 0:00:10.234 *******
ok: [managed_node2]

TASK [Get unused disks] ********************************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:57
Saturday 31 August 2024 18:35:44 -0400 (0:00:01.813) 0:00:12.048 *******
included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml for managed_node2

TASK [Ensure test packages] ****************************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:2
Saturday 31 August 2024 18:35:44 -0400 (0:00:00.148) 0:00:12.196 *******
ok: [managed_node2] => { "changed": false, "rc": 0, "results": [ "util-linux-2.23.2-65.el7_9.1.x86_64 providing util-linux is already installed" ] }
lsrpackages: util-linux

TASK [Find unused disks in the system] *****************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:11
Saturday 31 August 2024 18:35:45 -0400 (0:00:00.863) 0:00:13.060 *******
ok: [managed_node2] => { "changed": false, "disks": [ "sda" ], "info": [ "Line: NAME=\"/dev/sda\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdb\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdc\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdd\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sde\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdf\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdg\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdh\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdi\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/xvda\" TYPE=\"disk\" SIZE=\"268435456000\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/xvda1\" TYPE=\"part\" SIZE=\"268434390528\" FSTYPE=\"ext4\" LOG-SEC=\"512\"", "Line type [part] is not disk: NAME=\"/dev/xvda1\" TYPE=\"part\" SIZE=\"268434390528\" FSTYPE=\"ext4\" LOG-SEC=\"512\"", "Disk [/dev/xvda] attrs [{'fstype': '', 'type': 'disk', 'ssize': '512', 'size': '268435456000'}] is not an interface [scsi]" ] }
TASK [Debug why there are no unused disks] *************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:20
Saturday 31 August 2024 18:35:45 -0400 (0:00:00.606) 0:00:13.667 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set unused_disks if necessary] *******************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:29
Saturday 31 August 2024 18:35:45 -0400 (0:00:00.041) 0:00:13.708 *******
ok: [managed_node2] => { "ansible_facts": { "unused_disks": [ "sda" ] }, "changed": false }

TASK [Exit playbook when there's not enough unused disks in the system] ********
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:34
Saturday 31 August 2024 18:35:45 -0400 (0:00:00.039) 0:00:13.748 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Print unused disks] ******************************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:39
Saturday 31 August 2024 18:35:45 -0400 (0:00:00.031) 0:00:13.780 *******
ok: [managed_node2] => { "unused_disks": [ "sda" ] }

TASK [Test for correct handling of new encrypted volume w/ no key] *************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:66
Saturday 31 August 2024 18:35:45 -0400 (0:00:00.036) 0:00:13.816 *******
included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed_node2

TASK [Store global variable value copy] ****************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4
Saturday 31 August 2024 18:35:46 -0400 (0:00:00.072) 0:00:13.888 *******
ok: [managed_node2] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false }

TASK [Verify role raises correct error] ****************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10
Saturday 31 August 2024 18:35:46 -0400 (0:00:00.038) 0:00:13.927 *******

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Saturday 31 August 2024 18:35:46 -0400 (0:00:00.092) 0:00:14.020 *******
included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed_node2

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Saturday 31 August 2024 18:35:46 -0400 (0:00:00.072) 0:00:14.092 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Saturday 31 August 2024 18:35:46 -0400 (0:00:00.063) 0:00:14.155 *******
skipping: [managed_node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed_node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed_node2] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" }
skipping: [managed_node2] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Saturday 31 August 2024 18:35:46 -0400 (0:00:00.122) 0:00:14.277 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Saturday 31 August 2024 18:35:46 -0400 (0:00:00.047) 0:00:14.325 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Saturday 31 August 2024 18:35:46 -0400 (0:00:00.053) 0:00:14.378 *******
ok: [managed_node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Saturday 31 August 2024 18:35:46 -0400 (0:00:00.046) 0:00:14.424 *******
ok: [managed_node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Saturday 31 August 2024 18:35:46 -0400 (0:00:00.039) 0:00:14.464 *******
included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed_node2

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
Saturday 31 August 2024 18:35:46 -0400 (0:00:00.121) 0:00:14.586 *******
ok: [managed_node2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] }
lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Saturday 31 August 2024 18:35:48 -0400 (0:00:01.402) 0:00:15.988 *******
ok: [managed_node2] => { "storage_pools": [] }

TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14
Saturday 31 August 2024 18:35:48 -0400 (0:00:00.058) 0:00:16.047 *******
ok: [managed_node2] => { "storage_volumes": [ { "disks": [ "sda" ], "encryption": true, "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] }

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
Saturday 31 August 2024 18:35:48 -0400 (0:00:00.061) 0:00:16.108 *******
ok: [managed_node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] }

TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31
Saturday 31 August 2024 18:35:52 -0400 (0:00:04.013) 0:00:20.122 *******
included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed_node2

TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
Saturday 31 August 2024 18:35:52 -0400 (0:00:00.104) 0:00:20.227 *******

TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13
Saturday 31 August 2024 18:35:52 -0400 (0:00:00.065) 0:00:20.293 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19
Saturday 31 August 2024 18:35:52 -0400 (0:00:00.047) 0:00:20.341 *******

TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37
Saturday 31 August 2024 18:35:52 -0400 (0:00:00.047) 0:00:20.388 *******
ok: [managed_node2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] }
lsrpackages: cryptsetup kpartx

TASK [fedora.linux_system_roles.storage : Get service facts] *******************
task path:
TASK [fedora.linux_system_roles.storage : Get service facts] *******************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51
Saturday 31 August 2024 18:35:53 -0400 (0:00:00.738) 0:00:21.127 *******
ok: [managed_node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status":
"static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, 
"nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": 
"enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", 
"source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": 
"static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Saturday 31 August 2024 18:35:54 -0400 (0:00:01.108) 0:00:22.236 ******* ok: [managed_node2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Saturday 31 August 2024 18:35:54 -0400 (0:00:00.074) 0:00:22.311 ******* TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Saturday 31 August 2024 
TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69
Saturday 31 August 2024 18:35:54 -0400 (0:00:00.049) 0:00:22.360 *******
fatal: [managed_node2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }
MSG: encrypted volume 'foo' missing key/password
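This failure is deliberate: the volume requests encryption but supplies neither encryption_password nor encryption_key, and the role refuses to create a LUKS device it could never unlock. The fix, which the test applies a little further on, is to provide the passphrase; outside a test it should come from somewhere safer than the play itself, for example an Ansible Vault variable (vault_luks_password below is a hypothetical name):

    # Same volume spec with key material added; the test uses a literal
    # passphrase, a real deployment would reference a vaulted variable.
    storage_volumes:
      - name: foo
        type: disk
        disks: [sda]
        mount_point: /opt/test1
        encryption: true
        encryption_password: "{{ vault_luks_password }}"  # hypothetical vaulted var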
TASK [fedora.linux_system_roles.storage : Failed message] **********************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:109
Saturday 31 August 2024 18:35:58 -0400 (0:00:04.026) 0:00:26.386 *******
fatal: [managed_node2]: FAILED! => { "changed": false }
MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'diskvolume_mkfs_option_map': {u'ext4': u'-F', u'ext3': u'-F', u'ext2': u'-F'}, u'safe_mode': True, u'pools': [], u'volumes': [{u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 10737418240, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'encryption': True, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'fs_overwrite_existing': True, u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'cache_size': 0, u'mount_user': None, u'raid_spare_count': None, u'cache_mode': None, u'name': u'foo', u'mount_group': None, u'type': u'disk', u'disks': [u'sda'], u'cached': False, u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'pool_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_password': None, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'shared': False, u'type': u'lvm', u'encryption_key_size': None, u'raid_chunk_size': None}, u'volume_defaults': {u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 0, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'cache_size': 0, u'raid_spare_count': None, u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"encrypted volume 'foo' missing key/password"}
TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113
Saturday 31 August 2024 18:35:58 -0400 (0:00:00.071) 0:00:26.458 *******
TASK [Check that we failed in the role] ****************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23
Saturday 31 August 2024 18:35:58 -0400 (0:00:00.048) 0:00:26.507 *******
ok: [managed_node2] => { "changed": false }
MSG: All assertions passed
TASK [Verify the blivet output and error message are correct] ******************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28
Saturday 31 August 2024 18:35:58 -0400 (0:00:00.061) 0:00:26.569 *******
ok: [managed_node2] => { "changed": false }
MSG: All assertions passed
TASK [Verify correct exception or error message] *******************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39
Saturday 31 August 2024 18:35:58 -0400 (0:00:00.093) 0:00:26.662 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }
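The two assertion tasks above come from verify-role-failed.yml, which confirms the role failed for the expected reason instead of letting the play die. A sketch of that pattern (task layout and variable names are illustrative, not the test's actual ones):

    # Sketch of a block/rescue harness for asserting on an expected role failure.
    - name: Run the role, expecting it to fail
      block:
        - include_role:
            name: fedora.linux_system_roles.storage
        - fail:
            msg: the role was expected to fail but did not
      rescue:
        - set_fact:
            __storage_failed_result: "{{ ansible_failed_result }}"

    - name: Check that we failed in the role
      assert:
        that:
          - __storage_failed_result is defined
          - "'missing key/password' in __storage_failed_result.msg"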
TASK [Create an encrypted disk volume w/ default fs] ***************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:81
Saturday 31 August 2024 18:35:58 -0400 (0:00:00.052) 0:00:26.715 *******
TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Saturday 31 August 2024 18:35:59 -0400 (0:00:00.210) 0:00:26.926 *******
included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed_node2
TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Saturday 31 August 2024 18:35:59 -0400 (0:00:00.164) 0:00:27.090 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Saturday 31 August 2024 18:35:59 -0400 (0:00:00.132) 0:00:27.223 *******
skipping: [managed_node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed_node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed_node2] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" }
skipping: [managed_node2] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Saturday 31 August 2024 18:35:59 -0400 (0:00:00.170) 0:00:27.394 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Saturday 31 August 2024 18:35:59 -0400 (0:00:00.105) 0:00:27.499 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Saturday 31 August 2024 18:35:59 -0400 (0:00:00.054) 0:00:27.553 *******
ok: [managed_node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }
TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Saturday 31 August 2024 18:35:59 -0400 (0:00:00.088) 0:00:27.642 *******
ok: [managed_node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }
TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Saturday 31 August 2024 18:35:59 -0400 (0:00:00.070) 0:00:27.712 *******
included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed_node2
TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
Saturday 31 August 2024 18:36:00 -0400 (0:00:00.142) 0:00:27.855 *******
ok: [managed_node2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] }
lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34
TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Saturday 31 August 2024 18:36:01 -0400 (0:00:01.468) 0:00:29.323 *******
ok: [managed_node2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" }
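The "VARIABLE IS NOT DEFINED!" output above is benign: this run passes only storage_volumes, so the role's debug of storage_pools has nothing to show. A caller that wants a quiet line can template an explicit default instead (a trivial sketch, not part of the role):

    - name: Show storage_pools with an explicit default
      debug:
        msg: "{{ storage_pools | default([]) }}"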
TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14
Saturday 31 August 2024 18:36:01 -0400 (0:00:00.055) 0:00:29.379 *******
ok: [managed_node2] => { "storage_volumes": [ { "disks": [ "sda" ], "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] }
TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
Saturday 31 August 2024 18:36:01 -0400 (0:00:00.061) 0:00:29.441 *******
ok: [managed_node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] }
TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31
Saturday 31 August 2024 18:36:05 -0400 (0:00:04.009) 0:00:33.450 *******
included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed_node2
TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
Saturday 31 August 2024 18:36:05 -0400 (0:00:00.188) 0:00:33.639 *******
TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13
Saturday 31 August 2024 18:36:05 -0400 (0:00:00.085) 0:00:33.724 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19
Saturday 31 August 2024 18:36:05 -0400 (0:00:00.072) 0:00:33.797 *******
TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37
Saturday 31 August 2024 18:36:06 -0400 (0:00:00.116) 0:00:33.914 *******
ok: [managed_node2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] }
lsrpackages: cryptsetup kpartx
TASK [fedora.linux_system_roles.storage : Get service facts] *******************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51
Saturday 31 August 2024 18:36:06 -0400 (0:00:00.861) 0:00:34.775 *******
ok: [managed_node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" },
"auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": 
"dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": 
"lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, 
"plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": 
"rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { 
"name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Saturday 31 August 2024 18:36:08 -0400 (0:00:01.137) 0:00:35.912 ******* ok: [managed_node2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Saturday 31 August 2024 18:36:08 -0400 (0:00:00.099) 0:00:36.012 ******* TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Saturday 31 August 2024 18:36:08 -0400 (0:00:00.051) 0:00:36.064 ******* changed: [managed_node2] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-3a9b356b-ac1b-4da4-b00e-67c3a3d577c5", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-3a9b356b-ac1b-4da4-b00e-67c3a3d577c5", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-3a9b356b-ac1b-4da4-b00e-67c3a3d577c5", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-3a9b356b-ac1b-4da4-b00e-67c3a3d577c5" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": 
"/opt/test1", "src": "/dev/mapper/luks-3a9b356b-ac1b-4da4-b00e-67c3a3d577c5", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-3a9b356b-ac1b-4da4-b00e-67c3a3d577c5", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-3a9b356b-ac1b-4da4-b00e-67c3a3d577c5", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Saturday 31 August 2024 18:36:19 -0400 (0:00:10.778) 0:00:46.843 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Saturday 31 August 2024 18:36:19 -0400 (0:00:00.060) 0:00:46.903 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1725143712.4371896, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "72884e3f126482c2d28276ff7c57744fa95eff91", "ctime": 1725143711.9231806, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263550, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1725143711.9231806, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1229, "uid": 0, "version": "18446744071584693682", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Saturday 31 August 2024 18:36:19 -0400 (0:00:00.630) 0:00:47.534 ******* ok: [managed_node2] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Saturday 31 August 2024 18:36:20 -0400 (0:00:00.720) 0:00:48.254 ******* TASK [fedora.linux_system_roles.storage : Show 
blivet_output] ****************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Saturday 31 August 2024 18:36:20 -0400 (0:00:00.050) 0:00:48.305 ******* ok: [managed_node2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-3a9b356b-ac1b-4da4-b00e-67c3a3d577c5", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-3a9b356b-ac1b-4da4-b00e-67c3a3d577c5", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-3a9b356b-ac1b-4da4-b00e-67c3a3d577c5", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-3a9b356b-ac1b-4da4-b00e-67c3a3d577c5" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-3a9b356b-ac1b-4da4-b00e-67c3a3d577c5", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-3a9b356b-ac1b-4da4-b00e-67c3a3d577c5", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-3a9b356b-ac1b-4da4-b00e-67c3a3d577c5", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Saturday 31 August 2024 18:36:20 -0400 (0:00:00.106) 0:00:48.412 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Saturday 31 August 2024 18:36:20 -0400 (0:00:00.103) 0:00:48.515 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-3a9b356b-ac1b-4da4-b00e-67c3a3d577c5", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-3a9b356b-ac1b-4da4-b00e-67c3a3d577c5", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, 
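The blivet_output shown here records what the role just did: format /dev/sda as a LUKS container, open it as /dev/mapper/luks-3a9b356b-ac1b-4da4-b00e-67c3a3d577c5, create xfs on the mapper device, and queue a mount on /opt/test1. A hedged reconstruction of a role invocation that matches the recorded volume facts; luks_password is a stand-in for the value the log hides as VALUE_SPECIFIED_IN_NO_LOG_PARAMETER:

    - hosts: managed_node2
      roles:
        - role: fedora.linux_system_roles.storage
          vars:
            storage_volumes:
              - name: foo
                type: disk
                disks:
                  - sda
                fs_type: xfs
                mount_point: /opt/test1
                encryption: true
                encryption_password: "{{ luks_password }}"  # stand-in; real value is no_log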
"encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Saturday 31 August 2024 18:36:20 -0400 (0:00:00.079) 0:00:48.595 ******* TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Saturday 31 August 2024 18:36:20 -0400 (0:00:00.054) 0:00:48.650 ******* ok: [managed_node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Saturday 31 August 2024 18:36:21 -0400 (0:00:00.942) 0:00:49.592 ******* changed: [managed_node2] => (item={u'src': u'/dev/mapper/luks-3a9b356b-ac1b-4da4-b00e-67c3a3d577c5', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-3a9b356b-ac1b-4da4-b00e-67c3a3d577c5", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-3a9b356b-ac1b-4da4-b00e-67c3a3d577c5" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Saturday 31 August 2024 18:36:22 -0400 (0:00:00.658) 0:00:50.251 ******* skipping: [managed_node2] => (item={u'src': u'/dev/mapper/luks-3a9b356b-ac1b-4da4-b00e-67c3a3d577c5', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-3a9b356b-ac1b-4da4-b00e-67c3a3d577c5", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Saturday 31 August 2024 18:36:22 -0400 (0:00:00.067) 
0:00:50.318 ******* ok: [managed_node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Saturday 31 August 2024 18:36:23 -0400 (0:00:00.557) 0:00:50.876 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1725142921.7026803, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1718879272.062, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131079, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1718879026.308, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "18446744072852913879", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Saturday 31 August 2024 18:36:23 -0400 (0:00:00.403) 0:00:51.280 ******* changed: [managed_node2] => (item={u'state': u'present', u'password': u'-', u'name': u'luks-3a9b356b-ac1b-4da4-b00e-67c3a3d577c5', u'backing_device': u'/dev/sda'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-3a9b356b-ac1b-4da4-b00e-67c3a3d577c5", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Saturday 31 August 2024 18:36:23 -0400 (0:00:00.426) 0:00:51.706 ******* ok: [managed_node2] TASK [Verify role results] ***************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:93 Saturday 31 August 2024 18:36:24 -0400 (0:00:00.864) 0:00:52.571 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed_node2 TASK [Print out pool information] ********************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Saturday 31 August 2024 18:36:24 -0400 (0:00:00.182) 0:00:52.753 ******* skipping: [managed_node2] => {} TASK [Print out volume information] ******************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Saturday 31 August 2024 18:36:25 -0400 (0:00:00.067) 0:00:52.820 ******* ok: [managed_node2] => { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-3a9b356b-ac1b-4da4-b00e-67c3a3d577c5", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-3a9b356b-ac1b-4da4-b00e-67c3a3d577c5", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, 
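The "Manage /etc/crypttab" task above appended the line "luks-3a9b356b-ac1b-4da4-b00e-67c3a3d577c5 /dev/sda -": mapper name, backing device, and "-" for the key file, meaning no key file is stored on disk. The role edits the file itself; as an illustration only, the same entry could be expressed with the crypttab module:

    - name: Register the LUKS mapping in /etc/crypttab (illustrative equivalent)
      crypttab:
        name: luks-3a9b356b-ac1b-4da4-b00e-67c3a3d577c5
        backing_device: /dev/sda
        password: "-"
        state: present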
"compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Saturday 31 August 2024 18:36:25 -0400 (0:00:00.116) 0:00:52.937 ******* ok: [managed_node2] => { "changed": false, "info": { "/dev/mapper/luks-3a9b356b-ac1b-4da4-b00e-67c3a3d577c5": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-3a9b356b-ac1b-4da4-b00e-67c3a3d577c5", "size": "10G", "type": "crypt", "uuid": "a9ca67dc-33c4-4d18-8275-ffddf5e24017" }, "/dev/sda": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "3a9b356b-ac1b-4da4-b00e-67c3a3d577c5" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Saturday 31 August 2024 18:36:25 -0400 (0:00:00.759) 0:00:53.697 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003287", "end": "2024-08-31 18:36:26.616185", "rc": 0, "start": "2024-08-31 18:36:26.612898" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Thu Jun 20 10:23:46 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man 
pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info
#
UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
/dev/mapper/luks-3a9b356b-ac1b-4da4-b00e-67c3a3d577c5 /opt/test1 xfs defaults 0 0
TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Saturday 31 August 2024 18:36:26 -0400 (0:00:00.843) 0:00:54.540 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003100", "end": "2024-08-31 18:36:27.207936", "failed_when_result": false, "rc": 0, "start": "2024-08-31 18:36:27.204836" } STDOUT: luks-3a9b356b-ac1b-4da4-b00e-67c3a3d577c5 /dev/sda - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Saturday 31 August 2024 18:36:27 -0400 (0:00:00.615) 0:00:55.156 ******* TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Saturday 31 August 2024 18:36:27 -0400 (0:00:00.099) 0:00:55.255 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed_node2 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Saturday 31 August 2024 18:36:27 -0400 (0:00:00.130) 0:00:55.385 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Saturday 31 August 2024 18:36:27 -0400 (0:00:00.059) 0:00:55.445 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed_node2 included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed_node2 included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed_node2 included:
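Both persistence files now check out: /etc/fstab carries the xfs mount for the mapper device and /etc/crypttab carries the matching LUKS mapping. The verify tasks below count regex matches over these files via set_fact; reduced to a standalone illustration (the register name is hypothetical):

    - name: Count fstab references to the LUKS mapper device
      command: grep -c '^/dev/mapper/luks-' /etc/fstab
      register: luks_fstab_refs  # hypothetical name
      changed_when: false
      failed_when: false

    - name: Expect exactly one such entry
      assert:
        that:
          - luks_fstab_refs.stdout | int == 1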
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed_node2 included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed_node2 included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed_node2 included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed_node2 included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed_node2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Saturday 31 August 2024 18:36:27 -0400 (0:00:00.221) 0:00:55.666 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-3a9b356b-ac1b-4da4-b00e-67c3a3d577c5" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Saturday 31 August 2024 18:36:27 -0400 (0:00:00.038) 0:00:55.704 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Saturday 31 August 2024 18:36:27 -0400 (0:00:00.042) 0:00:55.746 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Saturday 31 August 2024 18:36:27 -0400 (0:00:00.033) 0:00:55.780 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Saturday 31 August 2024 18:36:28 -0400 (0:00:00.039) 0:00:55.820 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Saturday 31 August 2024 18:36:28 -0400 (0:00:00.042) 0:00:55.862 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Saturday 31 August 2024 18:36:28 -0400 (0:00:00.069) 0:00:55.932 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: 
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Saturday 31 August 2024 18:36:28 -0400 (0:00:00.050) 0:00:55.983 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Saturday 31 August 2024 18:36:28 -0400 (0:00:00.049) 0:00:56.032 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Saturday 31 August 2024 18:36:28 -0400 (0:00:00.051) 0:00:56.084 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Saturday 31 August 2024 18:36:28 -0400 (0:00:00.049) 0:00:56.133 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Saturday 31 August 2024 18:36:28 -0400 (0:00:00.050) 0:00:56.183 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-3a9b356b-ac1b-4da4-b00e-67c3a3d577c5 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Saturday 31 August 2024 18:36:28 -0400 (0:00:00.089) 0:00:56.273 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Saturday 31 August 2024 18:36:28 -0400 (0:00:00.061) 0:00:56.334 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Saturday 31 August 2024 18:36:28 -0400 (0:00:00.061) 0:00:56.395 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: 
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Saturday 31 August 2024 18:36:28 -0400 (0:00:00.052) 0:00:56.448 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:51 Saturday 31 August 2024 18:36:28 -0400 (0:00:00.059) 0:00:56.507 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Saturday 31 August 2024 18:36:28 -0400 (0:00:00.052) 0:00:56.560 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Saturday 31 August 2024 18:36:28 -0400 (0:00:00.070) 0:00:56.630 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Saturday 31 August 2024 18:36:28 -0400 (0:00:00.074) 0:00:56.705 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1725143778.592351, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1725143778.592351, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 27840, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1725143778.592351, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Saturday 31 August 2024 18:36:29 -0400 (0:00:00.431) 0:00:57.137 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Saturday 31 August 2024 18:36:29 -0400 (0:00:00.064) 0:00:57.202 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: 
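The device subset stats the raw device node and asserts on the result; the stat output above shows /dev/sda existing as a block device (isblk true, mode 0660, group disk). Boiled down to a sketch (the register name is hypothetical):

    - name: Stat the expected device node
      stat:
        path: /dev/sda
      register: dev_node  # hypothetical name

    - name: Assert the node exists and is a block device
      assert:
        that:
          - dev_node.stat.exists
          - dev_node.stat.isblk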
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Saturday 31 August 2024 18:36:29 -0400 (0:00:00.043) 0:00:57.245 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Saturday 31 August 2024 18:36:29 -0400 (0:00:00.050) 0:00:57.295 ******* ok: [managed_node2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Saturday 31 August 2024 18:36:29 -0400 (0:00:00.046) 0:00:57.342 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Saturday 31 August 2024 18:36:29 -0400 (0:00:00.041) 0:00:57.384 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Saturday 31 August 2024 18:36:29 -0400 (0:00:00.039) 0:00:57.424 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1725143778.7943544, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1725143778.7943544, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 167679, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1725143778.7943544, "nlink": 1, "path": "/dev/mapper/luks-3a9b356b-ac1b-4da4-b00e-67c3a3d577c5", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Saturday 31 August 2024 18:36:29 -0400 (0:00:00.369) 0:00:57.794 ******* ok: [managed_node2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Saturday 31 August 2024 18:36:30 -0400 (0:00:00.760) 0:00:58.554 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda" ], "delta": "0:00:02.507130", "end": "2024-08-31 18:36:33.573391", "rc": 0, "start": "2024-08-31 18:36:31.066261" } STDOUT: LUKS header information for /dev/sda Version: 1 Cipher name: aes Cipher mode: xts-plain64 Hash spec: sha256 Payload offset: 8192 MK 
bits: 512
MK digest: ed 2a e9 fc 36 8b a4 21 dd 46 76 c9 5b 53 57 fa ce b4 58 7a
MK salt: c7 c9 9f 33 0c 0f 41 0a 15 e8 2b 6c eb bd ea e1 3b d3 b8 19 9c 63 61 27 12 3a 1a ca a7 f5 92 d2
MK iterations: 20608
UUID: 3a9b356b-ac1b-4da4-b00e-67c3a3d577c5
Key Slot 0: ENABLED
  Iterations: 329740
  Salt: fc a4 f6 fb 6b 94 b3 0a 32 82 16 6b a2 c1 57 3d 62 3f 55 92 7f bb 31 a2 d1 c7 6b f6 b2 c2 fc 13
  Key material offset: 8
  AF stripes: 4000
Key Slot 1: DISABLED
Key Slot 2: DISABLED
Key Slot 3: DISABLED
Key Slot 4: DISABLED
Key Slot 5: DISABLED
Key Slot 6: DISABLED
Key Slot 7: DISABLED
TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Saturday 31 August 2024 18:36:33 -0400 (0:00:02.917) 0:01:01.471 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Saturday 31 August 2024 18:36:33 -0400 (0:00:00.063) 0:01:01.535 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Saturday 31 August 2024 18:36:33 -0400 (0:00:00.067) 0:01:01.603 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Saturday 31 August 2024 18:36:33 -0400 (0:00:00.064) 0:01:01.668 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Saturday 31 August 2024 18:36:33 -0400 (0:00:00.065) 0:01:01.733 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Saturday 31 August 2024 18:36:33 -0400 (0:00:00.080) 0:01:01.814 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Saturday 31 August 2024 18:36:34 -0400 (0:00:00.059) 0:01:01.873 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Saturday 31 August 2024 18:36:34 -0400 (0:00:00.075) 0:01:01.949 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-3a9b356b-ac1b-4da4-b00e-67c3a3d577c5 /dev/sda -" ],
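The LUKS header dumped above shows the defaults the role left in place on this CentOS 7 host: a LUKS1 header, aes in xts-plain64 mode, a 512-bit master key, and only key slot 0 enabled. A sketch of asserting on those header fields; the real test-verify-volume-encryption.yml drives this through its own storage_test_* variables, and the register name here is hypothetical:

    - name: Dump the LUKS header
      command: cryptsetup luksDump /dev/sda
      register: luks_dump  # hypothetical name
      changed_when: false

    - name: Check the cipher recorded in the header
      assert:
        that:
          - "luks_dump.stdout is search('Cipher name:\\s+aes')"
          - "luks_dump.stdout is search('Cipher mode:\\s+xts-plain64')"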
"_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Saturday 31 August 2024 18:36:34 -0400 (0:00:00.066) 0:01:02.015 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Saturday 31 August 2024 18:36:34 -0400 (0:00:00.063) 0:01:02.079 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Saturday 31 August 2024 18:36:34 -0400 (0:00:00.075) 0:01:02.154 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Saturday 31 August 2024 18:36:34 -0400 (0:00:00.073) 0:01:02.227 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Saturday 31 August 2024 18:36:34 -0400 (0:00:00.070) 0:01:02.298 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Saturday 31 August 2024 18:36:34 -0400 (0:00:00.080) 0:01:02.378 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Saturday 31 August 2024 18:36:34 -0400 (0:00:00.060) 0:01:02.439 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Saturday 31 August 2024 18:36:34 -0400 (0:00:00.047) 0:01:02.486 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Saturday 31 August 2024 18:36:34 -0400 (0:00:00.051) 0:01:02.537 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] 
**************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Saturday 31 August 2024 18:36:34 -0400 (0:00:00.051) 0:01:02.589 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Saturday 31 August 2024 18:36:34 -0400 (0:00:00.041) 0:01:02.631 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Saturday 31 August 2024 18:36:34 -0400 (0:00:00.032) 0:01:02.663 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Saturday 31 August 2024 18:36:34 -0400 (0:00:00.031) 0:01:02.694 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Saturday 31 August 2024 18:36:34 -0400 (0:00:00.033) 0:01:02.728 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Saturday 31 August 2024 18:36:34 -0400 (0:00:00.038) 0:01:02.767 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Saturday 31 August 2024 18:36:34 -0400 (0:00:00.041) 0:01:02.808 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Saturday 31 August 2024 18:36:35 -0400 (0:00:00.052) 0:01:02.861 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Saturday 31 August 2024 18:36:35 -0400 (0:00:00.065) 0:01:02.926 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Saturday 31 August 2024 
18:36:35 -0400 (0:00:00.052) 0:01:02.978 ******* ok: [managed_node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Saturday 31 August 2024 18:36:35 -0400 (0:00:00.071) 0:01:03.050 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Saturday 31 August 2024 18:36:35 -0400 (0:00:00.078) 0:01:03.128 ******* skipping: [managed_node2] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Saturday 31 August 2024 18:36:35 -0400 (0:00:00.073) 0:01:03.202 ******* skipping: [managed_node2] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Saturday 31 August 2024 18:36:35 -0400 (0:00:00.071) 0:01:03.273 ******* skipping: [managed_node2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Saturday 31 August 2024 18:36:35 -0400 (0:00:00.061) 0:01:03.335 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Saturday 31 August 2024 18:36:35 -0400 (0:00:00.051) 0:01:03.387 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Saturday 31 August 2024 18:36:35 -0400 (0:00:00.055) 0:01:03.442 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Saturday 31 August 2024 18:36:35 -0400 (0:00:00.052) 0:01:03.495 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Saturday 31 August 2024 18:36:35 -0400 (0:00:00.052) 0:01:03.547 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 
Saturday 31 August 2024 18:36:35 -0400 (0:00:00.052) 0:01:03.600 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Saturday 31 August 2024 18:36:35 -0400 (0:00:00.104) 0:01:03.704 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Saturday 31 August 2024 18:36:35 -0400 (0:00:00.078) 0:01:03.782 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Saturday 31 August 2024 18:36:36 -0400 (0:00:00.061) 0:01:03.844 ******* skipping: [managed_node2] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Saturday 31 August 2024 18:36:36 -0400 (0:00:00.056) 0:01:03.901 ******* skipping: [managed_node2] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Saturday 31 August 2024 18:36:36 -0400 (0:00:00.052) 0:01:03.953 ******* skipping: [managed_node2] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Saturday 31 August 2024 18:36:36 -0400 (0:00:00.071) 0:01:04.025 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Saturday 31 August 2024 18:36:36 -0400 (0:00:00.094) 0:01:04.119 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Saturday 31 August 2024 18:36:36 -0400 (0:00:00.060) 0:01:04.180 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Saturday 31 August 2024 18:36:36 -0400 (0:00:00.090) 0:01:04.270 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 
Saturday 31 August 2024 18:36:36 -0400 (0:00:00.066) 0:01:04.337 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Saturday 31 August 2024 18:36:36 -0400 (0:00:00.087) 0:01:04.425 ******* ok: [managed_node2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Saturday 31 August 2024 18:36:36 -0400 (0:00:00.061) 0:01:04.487 ******* ok: [managed_node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Saturday 31 August 2024 18:36:36 -0400 (0:00:00.058) 0:01:04.545 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Saturday 31 August 2024 18:36:36 -0400 (0:00:00.092) 0:01:04.638 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Saturday 31 August 2024 18:36:36 -0400 (0:00:00.149) 0:01:04.787 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Saturday 31 August 2024 18:36:37 -0400 (0:00:00.103) 0:01:04.891 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Saturday 31 August 2024 18:36:37 -0400 (0:00:00.060) 0:01:04.951 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Saturday 31 August 2024 18:36:37 -0400 (0:00:00.052) 0:01:05.004 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Saturday 31 August 2024 18:36:37 -0400 (0:00:00.053) 0:01:05.058 ******* skipping: [managed_node2] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Saturday 31 August 2024 18:36:37 -0400 (0:00:00.067) 0:01:05.126 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Saturday 31 August 2024 18:36:37 -0400 (0:00:00.068) 0:01:05.194 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Saturday 31 August 2024 18:36:37 -0400 (0:00:00.063) 0:01:05.257 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Saturday 31 August 2024 18:36:37 -0400 (0:00:00.052) 0:01:05.310 ******* changed: [managed_node2] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:99 Saturday 31 August 2024 18:36:38 -0400 (0:00:00.696) 0:01:06.007 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed_node2 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Saturday 31 August 2024 18:36:38 -0400 (0:00:00.105) 0:01:06.113 ******* ok: [managed_node2] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Saturday 31 August 2024 18:36:38 -0400 (0:00:00.046) 0:01:06.159 ******* TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 31 August 2024 18:36:38 -0400 (0:00:00.105) 0:01:06.265 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 31 August 2024 18:36:38 -0400 (0:00:00.053) 0:01:06.319 ******* skipping: [managed_node2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 31 August 2024 18:36:38 -0400 (0:00:00.042) 0:01:06.362 ******* skipping: [managed_node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed_node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed_node2] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed_node2] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 31 August 2024 18:36:38 -0400 (0:00:00.117) 0:01:06.479 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 31 August 2024 18:36:38 -0400 (0:00:00.062) 0:01:06.542 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 31 August 2024 18:36:38 -0400 (0:00:00.048) 0:01:06.590 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 31 August 2024 18:36:38 -0400 (0:00:00.049) 0:01:06.639 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 31 August 2024 18:36:38 -0400 (0:00:00.049) 0:01:06.689 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 31 August 2024 
18:36:38 -0400 (0:00:00.125) 0:01:06.814 ******* ok: [managed_node2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34 TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 31 August 2024 18:36:40 -0400 (0:00:01.560) 0:01:08.375 ******* ok: [managed_node2] => { "storage_pools": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 31 August 2024 18:36:40 -0400 (0:00:00.062) 0:01:08.438 ******* ok: [managed_node2] => { "storage_volumes": [ { "disks": [ "sda" ], "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 31 August 2024 18:36:40 -0400 (0:00:00.067) 0:01:08.505 ******* ok: [managed_node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Saturday 31 August 2024 18:36:45 -0400 (0:00:04.331) 0:01:12.837 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 31 August 2024 18:36:45 -0400 (0:00:00.095) 0:01:12.932 ******* TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 31 August 2024 18:36:45 -0400 (0:00:00.049) 0:01:12.982 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 31 August 2024 18:36:45 -0400 (0:00:00.051) 0:01:13.034 ******* TASK [fedora.linux_system_roles.storage : Make sure required packages are 
installed] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 31 August 2024 18:36:45 -0400 (0:00:00.049) 0:01:13.084 ******* ok: [managed_node2] => { "changed": false, "rc": 0, "results": [ "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } lsrpackages: kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Saturday 31 August 2024 18:36:46 -0400 (0:00:00.732) 0:01:13.816 ******* ok: [managed_node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { 
"name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Saturday 31 August 2024 18:36:47 -0400 (0:00:01.081) 0:01:14.898 ******* ok: [managed_node2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : 
Mask the systemd cryptsetup services] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Saturday 31 August 2024 18:36:47 -0400 (0:00:00.074) 0:01:14.972 ******* TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Saturday 31 August 2024 18:36:47 -0400 (0:00:00.053) 0:01:15.025 ******* fatal: [managed_node2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-3a9b356b-ac1b-4da4-b00e-67c3a3d577c5' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:109 Saturday 31 August 2024 18:36:51 -0400 (0:00:04.031) 0:01:19.057 ******* fatal: [managed_node2]: FAILED! => { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'diskvolume_mkfs_option_map': {u'ext4': u'-F', u'ext3': u'-F', u'ext2': u'-F'}, u'safe_mode': True, u'pools': [], u'volumes': [{u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 10733223936, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'fs_overwrite_existing': True, u'encryption_key_size': 0, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': u'luks1', u'raid_stripe_size': None, u'cache_size': 0, u'mount_user': None, u'raid_spare_count': None, u'cache_mode': None, u'name': u'foo', u'mount_group': None, u'type': u'disk', u'disks': [u'sda'], u'cached': False, u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'pool_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_password': None, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'shared': False, u'type': u'lvm', u'encryption_key_size': None, u'raid_chunk_size': None}, u'volume_defaults': {u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 0, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'cache_size': 0, u'raid_spare_count': None, u'cache_mode': None, u'deduplication': None, 
u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove existing formatting on device 'luks-3a9b356b-ac1b-4da4-b00e-67c3a3d577c5' in safe mode due to encryption removal"} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Saturday 31 August 2024 18:36:51 -0400 (0:00:00.053) 0:01:19.111 ******* TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Saturday 31 August 2024 18:36:51 -0400 (0:00:00.042) 0:01:19.153 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Saturday 31 August 2024 18:36:51 -0400 (0:00:00.054) 0:01:19.208 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Saturday 31 August 2024 18:36:51 -0400 (0:00:00.074) 0:01:19.282 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Saturday 31 August 2024 18:36:51 -0400 (0:00:00.052) 0:01:19.335 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1725143798.1056936, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1725143798.1056936, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 67, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1725143798.1056936, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "268371773", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Saturday 31 August 2024 18:36:52 -0400 (0:00:00.536) 0:01:19.871 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer] ********************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:119 Saturday 31 August 2024 18:36:52 -0400 (0:00:00.076) 0:01:19.948 ******* TASK [fedora.linux_system_roles.storage : Set 
platform/version specific variables] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 31 August 2024 18:36:52 -0400 (0:00:00.245) 0:01:20.194 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 31 August 2024 18:36:52 -0400 (0:00:00.075) 0:01:20.269 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 31 August 2024 18:36:52 -0400 (0:00:00.053) 0:01:20.323 ******* skipping: [managed_node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed_node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed_node2] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed_node2] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 31 August 2024 18:36:52 -0400 (0:00:00.123) 0:01:20.447 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 31 August 2024 18:36:52 -0400 (0:00:00.044) 0:01:20.491 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 31 August 2024 18:36:52 -0400 (0:00:00.051) 0:01:20.542 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 31 August 2024 18:36:52 -0400 (0:00:00.051) 0:01:20.594 ******* ok: [managed_node2] => { "ansible_facts": { 
"_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 31 August 2024 18:36:52 -0400 (0:00:00.054) 0:01:20.649 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 31 August 2024 18:36:53 -0400 (0:00:00.189) 0:01:20.838 ******* ok: [managed_node2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34 TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 31 August 2024 18:36:54 -0400 (0:00:01.643) 0:01:22.482 ******* ok: [managed_node2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 31 August 2024 18:36:54 -0400 (0:00:00.120) 0:01:22.603 ******* ok: [managed_node2] => { "storage_volumes": [ { "disks": [ "sda" ], "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 31 August 2024 18:36:54 -0400 (0:00:00.098) 0:01:22.702 ******* ok: [managed_node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Saturday 31 August 2024 18:36:58 -0400 (0:00:04.030) 0:01:26.732 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 31 August 2024 18:36:58 -0400 (0:00:00.077) 0:01:26.810 ******* 
TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 31 August 2024 18:36:59 -0400 (0:00:00.030) 0:01:26.841 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 31 August 2024 18:36:59 -0400 (0:00:00.033) 0:01:26.874 ******* TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 31 August 2024 18:36:59 -0400 (0:00:00.030) 0:01:26.905 ******* ok: [managed_node2] => { "changed": false, "rc": 0, "results": [ "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } lsrpackages: kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Saturday 31 August 2024 18:36:59 -0400 (0:00:00.728) 0:01:27.633 ******* ok: [managed_node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { 
"name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": 
"fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", 
"status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": 
"restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": 
"systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": 
"teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Saturday 31 August 2024 18:37:00 -0400 (0:00:01.177) 0:01:28.810 ******* ok: [managed_node2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Saturday 31 August 2024 18:37:01 -0400 (0:00:00.073) 0:01:28.884 ******* TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Saturday 31 August 2024 18:37:01 -0400 (0:00:00.051) 0:01:28.935 ******* changed: [managed_node2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-3a9b356b-ac1b-4da4-b00e-67c3a3d577c5", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-3a9b356b-ac1b-4da4-b00e-67c3a3d577c5", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-3a9b356b-ac1b-4da4-b00e-67c3a3d577c5", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-3a9b356b-ac1b-4da4-b00e-67c3a3d577c5", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=69b339e4-2a2d-4e01-95fb-af566cf1496f", "state": "mounted" } ], "packages": [ "xfsprogs", "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=69b339e4-2a2d-4e01-95fb-af566cf1496f", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10733223936, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : 
Workaround for udev issue on some platforms] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Saturday 31 August 2024 18:37:05 -0400 (0:00:04.740) 0:01:33.675 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Saturday 31 August 2024 18:37:05 -0400 (0:00:00.055) 0:01:33.730 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1725143782.3244164, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "091c061e738974d2f6873c897d399989d17dcf6b", "ctime": 1725143782.3214164, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263550, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1725143782.3214164, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1311, "uid": 0, "version": "18446744071584693682", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Saturday 31 August 2024 18:37:06 -0400 (0:00:00.471) 0:01:34.202 ******* ok: [managed_node2] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Saturday 31 August 2024 18:37:06 -0400 (0:00:00.524) 0:01:34.726 ******* TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Saturday 31 August 2024 18:37:06 -0400 (0:00:00.049) 0:01:34.775 ******* ok: [managed_node2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-3a9b356b-ac1b-4da4-b00e-67c3a3d577c5", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-3a9b356b-ac1b-4da4-b00e-67c3a3d577c5", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-3a9b356b-ac1b-4da4-b00e-67c3a3d577c5", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-3a9b356b-ac1b-4da4-b00e-67c3a3d577c5", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=69b339e4-2a2d-4e01-95fb-af566cf1496f", "state": "mounted" } ], "packages": [ "xfsprogs", "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", 
"_kernel_device": "/dev/sda", "_mount_id": "UUID=69b339e4-2a2d-4e01-95fb-af566cf1496f", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10733223936, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Saturday 31 August 2024 18:37:07 -0400 (0:00:00.084) 0:01:34.860 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Saturday 31 August 2024 18:37:07 -0400 (0:00:00.058) 0:01:34.919 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=69b339e4-2a2d-4e01-95fb-af566cf1496f", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10733223936, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Saturday 31 August 2024 18:37:07 -0400 (0:00:00.064) 0:01:34.983 ******* changed: [managed_node2] => (item={u'src': u'/dev/mapper/luks-3a9b356b-ac1b-4da4-b00e-67c3a3d577c5', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": 
"/dev/mapper/luks-3a9b356b-ac1b-4da4-b00e-67c3a3d577c5", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-3a9b356b-ac1b-4da4-b00e-67c3a3d577c5" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Saturday 31 August 2024 18:37:07 -0400 (0:00:00.367) 0:01:35.351 ******* ok: [managed_node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Saturday 31 August 2024 18:37:08 -0400 (0:00:00.570) 0:01:35.921 ******* changed: [managed_node2] => (item={u'src': u'UUID=69b339e4-2a2d-4e01-95fb-af566cf1496f', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=69b339e4-2a2d-4e01-95fb-af566cf1496f", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=69b339e4-2a2d-4e01-95fb-af566cf1496f" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Saturday 31 August 2024 18:37:08 -0400 (0:00:00.594) 0:01:36.516 ******* skipping: [managed_node2] => (item={u'src': u'UUID=69b339e4-2a2d-4e01-95fb-af566cf1496f', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=69b339e4-2a2d-4e01-95fb-af566cf1496f", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Saturday 31 August 2024 18:37:08 -0400 (0:00:00.068) 0:01:36.584 ******* ok: [managed_node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Saturday 31 August 2024 18:37:09 -0400 (0:00:00.560) 0:01:37.145 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1725143787.2065022, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "6733baff996614ab8a7e3a48fd2f55c58335e6d5", "ctime": 1725143783.7934422, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 917510, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", 
"mode": "0600", "mtime": 1725143783.7924423, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 53, "uid": 0, "version": "18446744071584695966", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Saturday 31 August 2024 18:37:09 -0400 (0:00:00.353) 0:01:37.498 ******* changed: [managed_node2] => (item={u'state': u'absent', u'password': u'-', u'name': u'luks-3a9b356b-ac1b-4da4-b00e-67c3a3d577c5', u'backing_device': u'/dev/sda'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-3a9b356b-ac1b-4da4-b00e-67c3a3d577c5", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Saturday 31 August 2024 18:37:10 -0400 (0:00:00.407) 0:01:37.906 ******* ok: [managed_node2] TASK [Verify role results] ***************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:132 Saturday 31 August 2024 18:37:11 -0400 (0:00:00.987) 0:01:38.894 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed_node2 TASK [Print out pool information] ********************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Saturday 31 August 2024 18:37:11 -0400 (0:00:00.128) 0:01:39.022 ******* skipping: [managed_node2] => {} TASK [Print out volume information] ******************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Saturday 31 August 2024 18:37:11 -0400 (0:00:00.045) 0:01:39.067 ******* ok: [managed_node2] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=69b339e4-2a2d-4e01-95fb-af566cf1496f", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10733223936, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Saturday 31 August 2024 18:37:11 -0400 (0:00:00.051) 0:01:39.118 ******* ok: [managed_node2] => { "changed": false, "info": { "/dev/sda": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "69b339e4-2a2d-4e01-95fb-af566cf1496f" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } }
TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Saturday 31 August 2024 18:37:11 -0400 (0:00:00.382) 0:01:39.501 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003353", "end": "2024-08-31 18:37:11.979646", "rc": 0, "start": "2024-08-31 18:37:11.976293" }
STDOUT:
# system_role:storage
#
# /etc/fstab
# Created by anaconda on Thu Jun 20 10:23:46 2024
#
# Accessible filesystems, by reference, are maintained under '/dev/disk'
# See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info
#
UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
UUID=69b339e4-2a2d-4e01-95fb-af566cf1496f /opt/test1 xfs defaults 0 0
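Taken together, the block device info and the fstab dump above show the end state the test expects: /dev/sda now carries a plain xfs filesystem with UUID 69b339e4-2a2d-4e01-95fb-af566cf1496f, and /etc/fstab mounts it on /opt/test1 by UUID rather than through a /dev/mapper/luks-* path. A standalone check equivalent to what these two tasks establish (a sketch; the test itself gathers this data through its own helper tasks) might read:

  - name: Confirm the volume is mounted by UUID in /etc/fstab (sketch)
    # -F matches the literal string; a non-zero rc fails the task if the entry is missing
    command: grep -F 'UUID=69b339e4-2a2d-4e01-95fb-af566cf1496f /opt/test1 xfs' /etc/fstab
    changed_when: false

TASK [Read the /etc/crypttab file] 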
********************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Saturday 31 August 2024 18:37:12 -0400 (0:00:00.396) 0:01:39.897 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003256", "end": "2024-08-31 18:37:12.610758", "failed_when_result": false, "rc": 0, "start": "2024-08-31 18:37:12.607502" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Saturday 31 August 2024 18:37:12 -0400 (0:00:00.707) 0:01:40.605 ******* TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Saturday 31 August 2024 18:37:12 -0400 (0:00:00.087) 0:01:40.693 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed_node2 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Saturday 31 August 2024 18:37:12 -0400 (0:00:00.117) 0:01:40.810 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Saturday 31 August 2024 18:37:13 -0400 (0:00:00.099) 0:01:40.909 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed_node2 included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed_node2 included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed_node2 included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed_node2 included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed_node2 included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed_node2 included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed_node2 included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed_node2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Saturday 31 August 2024 18:37:13 -0400 (0:00:00.409) 0:01:41.319 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false } TASK [Set some facts] ********************************************************** task path: 
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Saturday 31 August 2024 18:37:13 -0400 (0:00:00.067) 0:01:41.387 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Saturday 31 August 2024 18:37:13 -0400 (0:00:00.083) 0:01:41.470 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Saturday 31 August 2024 18:37:13 -0400 (0:00:00.076) 0:01:41.547 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Saturday 31 August 2024 18:37:13 -0400 (0:00:00.119) 0:01:41.667 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Saturday 31 August 2024 18:37:13 -0400 (0:00:00.061) 0:01:41.728 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Saturday 31 August 2024 18:37:13 -0400 (0:00:00.050) 0:01:41.779 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Saturday 31 August 2024 18:37:14 -0400 (0:00:00.048) 0:01:41.827 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Saturday 31 August 2024 18:37:14 -0400 (0:00:00.047) 0:01:41.875 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Saturday 31 August 2024 18:37:14 -0400 (0:00:00.062) 0:01:41.938 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Saturday 31 August 2024 
18:37:14 -0400 (0:00:00.051) 0:01:41.990 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Saturday 31 August 2024 18:37:14 -0400 (0:00:00.051) 0:01:42.041 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=69b339e4-2a2d-4e01-95fb-af566cf1496f " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Saturday 31 August 2024 18:37:14 -0400 (0:00:00.105) 0:01:42.146 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Saturday 31 August 2024 18:37:14 -0400 (0:00:00.109) 0:01:42.255 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Saturday 31 August 2024 18:37:14 -0400 (0:00:00.100) 0:01:42.356 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Saturday 31 August 2024 18:37:14 -0400 (0:00:00.119) 0:01:42.475 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:51 Saturday 31 August 2024 18:37:14 -0400 (0:00:00.139) 0:01:42.615 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Saturday 31 August 2024 18:37:14 -0400 (0:00:00.120) 0:01:42.736 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: 
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Saturday 31 August 2024 18:37:15 -0400 (0:00:00.145) 0:01:42.882 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Saturday 31 August 2024 18:37:15 -0400 (0:00:00.085) 0:01:42.967 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1725143825.6881778, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1725143825.6881778, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 27840, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1725143825.6881778, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Saturday 31 August 2024 18:37:15 -0400 (0:00:00.492) 0:01:43.459 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Saturday 31 August 2024 18:37:15 -0400 (0:00:00.088) 0:01:43.548 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Saturday 31 August 2024 18:37:15 -0400 (0:00:00.052) 0:01:43.600 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Saturday 31 August 2024 18:37:15 -0400 (0:00:00.080) 0:01:43.681 ******* ok: [managed_node2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Saturday 31 August 2024 18:37:15 -0400 (0:00:00.084) 0:01:43.766 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Saturday 31 August 2024 18:37:16 -0400 (0:00:00.059) 0:01:43.826 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] 
************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Saturday 31 August 2024 18:37:16 -0400 (0:00:00.125) 0:01:43.951 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Saturday 31 August 2024 18:37:16 -0400 (0:00:00.112) 0:01:44.064 ******* ok: [managed_node2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Saturday 31 August 2024 18:37:17 -0400 (0:00:00.906) 0:01:44.970 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Saturday 31 August 2024 18:37:17 -0400 (0:00:00.040) 0:01:45.010 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Saturday 31 August 2024 18:37:17 -0400 (0:00:00.040) 0:01:45.051 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Saturday 31 August 2024 18:37:17 -0400 (0:00:00.047) 0:01:45.099 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Saturday 31 August 2024 18:37:17 -0400 (0:00:00.034) 0:01:45.133 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Saturday 31 August 2024 18:37:17 -0400 (0:00:00.048) 0:01:45.181 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Saturday 31 August 2024 18:37:17 -0400 (0:00:00.059) 0:01:45.240 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: 
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Saturday 31 August 2024 18:37:17 -0400 (0:00:00.049) 0:01:45.290 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Saturday 31 August 2024 18:37:17 -0400 (0:00:00.051) 0:01:45.341 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Saturday 31 August 2024 18:37:17 -0400 (0:00:00.061) 0:01:45.403 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Saturday 31 August 2024 18:37:17 -0400 (0:00:00.057) 0:01:45.461 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Saturday 31 August 2024 18:37:17 -0400 (0:00:00.048) 0:01:45.510 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Saturday 31 August 2024 18:37:17 -0400 (0:00:00.052) 0:01:45.562 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Saturday 31 August 2024 18:37:17 -0400 (0:00:00.052) 0:01:45.615 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Saturday 31 August 2024 18:37:17 -0400 (0:00:00.052) 0:01:45.667 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Saturday 31 August 2024 18:37:17 -0400 (0:00:00.071) 0:01:45.739 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] 
************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Saturday 31 August 2024 18:37:18 -0400 (0:00:00.082) 0:01:45.821 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Saturday 31 August 2024 18:37:18 -0400 (0:00:00.051) 0:01:45.872 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Saturday 31 August 2024 18:37:18 -0400 (0:00:00.073) 0:01:45.946 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Saturday 31 August 2024 18:37:18 -0400 (0:00:00.084) 0:01:46.030 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Saturday 31 August 2024 18:37:18 -0400 (0:00:00.049) 0:01:46.080 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Saturday 31 August 2024 18:37:18 -0400 (0:00:00.057) 0:01:46.137 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Saturday 31 August 2024 18:37:18 -0400 (0:00:00.064) 0:01:46.202 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Saturday 31 August 2024 18:37:18 -0400 (0:00:00.060) 0:01:46.262 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Saturday 31 August 2024 18:37:18 -0400 (0:00:00.059) 0:01:46.322 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Saturday 31 August 2024 18:37:18 
-0400 (0:00:00.047) 0:01:46.369 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Saturday 31 August 2024 18:37:18 -0400 (0:00:00.043) 0:01:46.412 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Saturday 31 August 2024 18:37:18 -0400 (0:00:00.039) 0:01:46.452 ******* ok: [managed_node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Saturday 31 August 2024 18:37:18 -0400 (0:00:00.037) 0:01:46.489 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Saturday 31 August 2024 18:37:18 -0400 (0:00:00.035) 0:01:46.525 ******* skipping: [managed_node2] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Saturday 31 August 2024 18:37:18 -0400 (0:00:00.034) 0:01:46.559 ******* skipping: [managed_node2] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Saturday 31 August 2024 18:37:18 -0400 (0:00:00.042) 0:01:46.602 ******* skipping: [managed_node2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Saturday 31 August 2024 18:37:18 -0400 (0:00:00.044) 0:01:46.647 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Saturday 31 August 2024 18:37:18 -0400 (0:00:00.051) 0:01:46.698 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Saturday 31 August 2024 18:37:18 -0400 (0:00:00.061) 0:01:46.760 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Saturday 31 
August 2024 18:37:19 -0400 (0:00:00.113) 0:01:46.873 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Saturday 31 August 2024 18:37:19 -0400 (0:00:00.074) 0:01:46.948 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Saturday 31 August 2024 18:37:19 -0400 (0:00:00.113) 0:01:47.061 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Saturday 31 August 2024 18:37:19 -0400 (0:00:00.122) 0:01:47.184 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Saturday 31 August 2024 18:37:19 -0400 (0:00:00.053) 0:01:47.237 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Saturday 31 August 2024 18:37:19 -0400 (0:00:00.075) 0:01:47.313 ******* skipping: [managed_node2] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Saturday 31 August 2024 18:37:19 -0400 (0:00:00.050) 0:01:47.363 ******* skipping: [managed_node2] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Saturday 31 August 2024 18:37:19 -0400 (0:00:00.051) 0:01:47.414 ******* skipping: [managed_node2] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Saturday 31 August 2024 18:37:19 -0400 (0:00:00.047) 0:01:47.462 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Saturday 31 August 2024 18:37:19 -0400 (0:00:00.055) 0:01:47.518 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Saturday 31 
August 2024 18:37:19 -0400 (0:00:00.061) 0:01:47.579 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Saturday 31 August 2024 18:37:19 -0400 (0:00:00.047) 0:01:47.627 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Saturday 31 August 2024 18:37:19 -0400 (0:00:00.047) 0:01:47.674 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Saturday 31 August 2024 18:37:19 -0400 (0:00:00.062) 0:01:47.737 ******* ok: [managed_node2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Saturday 31 August 2024 18:37:19 -0400 (0:00:00.067) 0:01:47.804 ******* ok: [managed_node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Saturday 31 August 2024 18:37:20 -0400 (0:00:00.061) 0:01:47.865 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Saturday 31 August 2024 18:37:20 -0400 (0:00:00.057) 0:01:47.923 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Saturday 31 August 2024 18:37:20 -0400 (0:00:00.050) 0:01:47.973 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Saturday 31 August 2024 18:37:20 -0400 (0:00:00.049) 0:01:48.023 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Saturday 31 August 2024 18:37:20 -0400 (0:00:00.048) 0:01:48.071 ******* skipping: [managed_node2] => { "changed": 
false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Saturday 31 August 2024 18:37:20 -0400 (0:00:00.049) 0:01:48.121 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Saturday 31 August 2024 18:37:20 -0400 (0:00:00.052) 0:01:48.173 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Saturday 31 August 2024 18:37:20 -0400 (0:00:00.057) 0:01:48.231 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Saturday 31 August 2024 18:37:20 -0400 (0:00:00.046) 0:01:48.278 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Saturday 31 August 2024 18:37:20 -0400 (0:00:00.068) 0:01:48.347 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Saturday 31 August 2024 18:37:20 -0400 (0:00:00.063) 0:01:48.410 ******* changed: [managed_node2] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:138 Saturday 31 August 2024 18:37:21 -0400 (0:00:00.461) 0:01:48.871 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed_node2 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Saturday 31 August 2024 18:37:21 -0400 (0:00:00.095) 0:01:48.967 ******* ok: [managed_node2] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Saturday 31 August 2024 18:37:21 -0400 (0:00:00.052) 
0:01:49.019 ******* TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 31 August 2024 18:37:21 -0400 (0:00:00.054) 0:01:49.073 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 31 August 2024 18:37:21 -0400 (0:00:00.051) 0:01:49.125 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 31 August 2024 18:37:21 -0400 (0:00:00.048) 0:01:49.174 ******* skipping: [managed_node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed_node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed_node2] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed_node2] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 31 August 2024 18:37:21 -0400 (0:00:00.121) 0:01:49.296 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 31 August 2024 18:37:21 -0400 (0:00:00.048) 0:01:49.344 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 31 August 2024 18:37:21 -0400 (0:00:00.064) 0:01:49.409 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 31 August 2024 18:37:21 -0400 (0:00:00.074) 
0:01:49.483 *******
ok: [managed_node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Saturday 31 August 2024 18:37:21 -0400 (0:00:00.077) 0:01:49.561 *******
included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed_node2

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
Saturday 31 August 2024 18:37:21 -0400 (0:00:00.143) 0:01:49.704 *******
ok: [managed_node2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] }
lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Saturday 31 August 2024 18:37:23 -0400 (0:00:01.452) 0:01:51.156 *******
ok: [managed_node2] => { "storage_pools": [] }

TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14
Saturday 31 August 2024 18:37:23 -0400 (0:00:00.052) 0:01:51.209 *******
ok: [managed_node2] => { "storage_volumes": [ { "disks": [ "sda" ], "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] }
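The storage_volumes value above is the whole request this part of the test makes of the role: take the raw disk sda, put LUKS encryption on it with the test's dummy passphrase, and mount the result at /opt/test1. A minimal standalone playbook issuing the same request would look roughly like the sketch below (the role name and variable values are copied from the log; the rest of the play is assumed):

- hosts: managed_node2
  become: true
  tasks:
    - name: Encrypt disk sda with LUKS and mount it at /opt/test1
      include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_volumes:
          - name: foo
            type: disk
            disks:
              - sda
            encryption: true
            encryption_password: yabbadabbadoo
            mount_point: /opt/test1

With storage_safe_mode left at its default of true, this is exactly the request that fails later in this run, because sda already carries a filesystem.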
TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
Saturday 31 August 2024 18:37:23 -0400 (0:00:00.055) 0:01:51.265 *******
ok: [managed_node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] }

TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31
Saturday 31 August 2024 18:37:27 -0400 (0:00:04.098) 0:01:55.363 *******
included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed_node2

TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
Saturday 31 August 2024 18:37:27 -0400 (0:00:00.068) 0:01:55.432 *******

TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13
Saturday 31 August 2024 18:37:27 -0400 (0:00:00.031) 0:01:55.463 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19
Saturday 31 August 2024 18:37:27 -0400 (0:00:00.032) 0:01:55.495 *******

TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37
Saturday 31 August 2024 18:37:27 -0400 (0:00:00.029) 0:01:55.524 *******
ok: [managed_node2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] }
lsrpackages: cryptsetup kpartx

TASK [fedora.linux_system_roles.storage : Get service facts] *******************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51
Saturday 31 August 2024 18:37:28 -0400 (0:00:00.642) 0:01:56.167 *******
ok: [managed_node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name":
"cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": 
"firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", 
"state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-cryptsetup@luks\\x2d3a9b356b\\x2dac1b\\x2d4da4\\x2db00e\\x2d67c3a3d577c5.service": { "name": "systemd-cryptsetup@luks\\x2d3a9b356b\\x2dac1b\\x2d4da4\\x2db00e\\x2d67c3a3d577c5.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": 
"systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": 
"systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Saturday 31 August 2024 18:37:29 -0400 (0:00:01.044) 0:01:57.212 ******* ok: [managed_node2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d3a9b356b\\x2dac1b\\x2d4da4\\x2db00e\\x2d67c3a3d577c5.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Saturday 31 August 2024 18:37:29 -0400 (0:00:00.058) 0:01:57.271 ******* changed: [managed_node2] => (item=systemd-cryptsetup@luks\x2d3a9b356b\x2dac1b\x2d4da4\x2db00e\x2d67c3a3d577c5.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d3a9b356b\\x2dac1b\\x2d4da4\\x2db00e\\x2d67c3a3d577c5.service", "name": "systemd-cryptsetup@luks\\x2d3a9b356b\\x2dac1b\\x2d4da4\\x2db00e\\x2d67c3a3d577c5.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "cryptsetup-pre.target dev-sda.device systemd-readahead-collect.service systemd-journald.socket systemd-readahead-replay.service system-systemd\\x2dcryptsetup.slice", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-sda.device", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "Delegate": "no", "Description": "Cryptography Setup for luks-3a9b356b-ac1b-4da4-b00e-67c3a3d577c5", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-3a9b356b-ac1b-4da4-b00e-67c3a3d577c5 /dev/sda - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup 
detach luks-3a9b356b-ac1b-4da4-b00e-67c3a3d577c5 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d3a9b356b\\x2dac1b\\x2d4da4\\x2db00e\\x2d67c3a3d577c5.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d3a9b356b\\x2dac1b\\x2d4da4\\x2db00e\\x2d67c3a3d577c5.service", "IgnoreOnIsolate": "yes", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d3a9b356b\\x2dac1b\\x2d4da4\\x2db00e\\x2d67c3a3d577c5.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "0", "TimeoutStopUSec": "0", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "bad", "WantedBy": "dev-sda.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } }
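Before handing control to the blivet module, the role looked up the systemd-cryptsetup@*.service units generated from /etc/crypttab (the storage_cryptsetup_services fact above) and masked them, presumably so that systemd cannot activate or tear down the LUKS device while the role is reconfiguring the disk; the units are unmasked again once the module returns. The masking step is roughly equivalent to this sketch (the unit name is copied from the log, with systemd's \x2d escaping of the dashes in the LUKS UUID; single quotes keep the backslashes literal in YAML):

- name: Mask the generated cryptsetup unit (sketch)
  systemd:
    name: 'systemd-cryptsetup@luks\x2d3a9b356b\x2dac1b\x2d4da4\x2db00e\x2d67c3a3d577c5.service'
    masked: true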
=> { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'sda' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:109 Saturday 31 August 2024 18:37:33 -0400 (0:00:03.826) 0:02:01.661 ******* fatal: [managed_node2]: FAILED! => { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'diskvolume_mkfs_option_map': {u'ext4': u'-F', u'ext3': u'-F', u'ext2': u'-F'}, u'safe_mode': True, u'pools': [], u'volumes': [{u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 10737418240, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'encryption': True, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'fs_overwrite_existing': True, u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'cache_size': 0, u'mount_user': None, u'raid_spare_count': None, u'cache_mode': None, u'name': u'foo', u'mount_group': None, u'type': u'disk', u'disks': [u'sda'], u'cached': False, u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'pool_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_password': None, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'shared': False, u'type': u'lvm', u'encryption_key_size': None, u'raid_chunk_size': None}, u'volume_defaults': {u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 0, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'cache_size': 0, u'raid_spare_count': None, u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove existing formatting on device 'sda' in safe mode due to adding encryption"} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Saturday 31 August 
TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113
Saturday 31 August 2024 18:37:33 -0400 (0:00:00.070) 0:02:01.732 *******
changed: [managed_node2] => (item=systemd-cryptsetup@luks\x2d3a9b356b\x2dac1b\x2d4da4\x2db00e\x2d67c3a3d577c5.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d3a9b356b\\x2dac1b\\x2d4da4\\x2db00e\\x2d67c3a3d577c5.service", "name": "systemd-cryptsetup@luks\\x2d3a9b356b\\x2dac1b\\x2d4da4\\x2db00e\\x2d67c3a3d577c5.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d3a9b356b\\x2dac1b\\x2d4da4\\x2db00e\\x2d67c3a3d577c5.service", "DevicePolicy": "auto", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d3a9b356b\\x2dac1b\\x2d4da4\\x2db00e\\x2d67c3a3d577c5.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "1048576", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "masked", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d3a9b356b\\x2dac1b\\x2d4da4\\x2db00e\\x2d67c3a3d577c5.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", "UnitFileState": "bad", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } }
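The next tasks are the standard negative-test checks from verify-role-failed.yml and verify-data-preservation.yml: assert that the role run really did fail, that the blivet output and error message match what the test expected, and that /opt/test1/quux, created just before the failed run, still exists untouched, i.e. safe mode prevented any reformatting. The failure check amounts to an assertion over the registered role result, along the lines of this sketch (the variable name blivet_output is hypothetical):

- name: Check that we failed in the role (sketch)
  assert:
    that:
      - blivet_output is failed
      - blivet_output.msg is search('cannot remove existing formatting')
    msg: the role was expected to fail while safe mode is on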
"StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", "UnitFileState": "bad", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Saturday 31 August 2024 18:37:34 -0400 (0:00:00.743) 0:02:02.475 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Saturday 31 August 2024 18:37:34 -0400 (0:00:00.060) 0:02:02.535 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Saturday 31 August 2024 18:37:34 -0400 (0:00:00.075) 0:02:02.610 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Saturday 31 August 2024 18:37:34 -0400 (0:00:00.054) 0:02:02.665 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1725143840.9784467, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1725143840.9784467, "dev": 2048, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 67, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1725143840.9784467, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "18446744072993298787", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Saturday 31 August 2024 18:37:35 -0400 (0:00:00.409) 0:02:03.074 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Add encryption to the volume] ******************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:158 Saturday 31 August 2024 18:37:35 -0400 (0:00:00.081) 0:02:03.156 ******* TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 31 August 
TASK [Add encryption to the volume] ******************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:158 Saturday 31 August 2024 18:37:35 -0400 (0:00:00.081) 0:02:03.156 *******
TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 31 August 2024 18:37:35 -0400 (0:00:00.191) 0:02:03.347 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed_node2
TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 31 August 2024 18:37:35 -0400 (0:00:00.083) 0:02:03.430 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 31 August 2024 18:37:35 -0400 (0:00:00.063) 0:02:03.494 ******* skipping: [managed_node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed_node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed_node2] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed_node2] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 31 August 2024 18:37:35 -0400 (0:00:00.111) 0:02:03.605 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 31 August 2024 18:37:35 -0400 (0:00:00.049) 0:02:03.654 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 31 August 2024 18:37:35 -0400 (0:00:00.055) 0:02:03.710 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }
TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 31 August 2024 18:37:35 -0400 (0:00:00.051) 0:02:03.761 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }
TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path:
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 31 August 2024 18:37:35 -0400 (0:00:00.047) 0:02:03.809 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 31 August 2024 18:37:36 -0400 (0:00:00.191) 0:02:04.001 ******* ok: [managed_node2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34 TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 31 August 2024 18:37:37 -0400 (0:00:01.523) 0:02:05.525 ******* ok: [managed_node2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 31 August 2024 18:37:37 -0400 (0:00:00.128) 0:02:05.653 ******* ok: [managed_node2] => { "storage_volumes": [ { "disks": [ "sda" ], "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 31 August 2024 18:37:37 -0400 (0:00:00.126) 0:02:05.780 ******* ok: [managed_node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Saturday 31 August 2024 18:37:42 -0400 (0:00:04.256) 0:02:10.036 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 31 August 2024 18:37:42 -0400 (0:00:00.156) 0:02:10.192 ******* TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: 
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 31 August 2024 18:37:42 -0400 (0:00:00.086) 0:02:10.278 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 31 August 2024 18:37:42 -0400 (0:00:00.086) 0:02:10.364 ******* TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 31 August 2024 18:37:42 -0400 (0:00:00.099) 0:02:10.464 ******* ok: [managed_node2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } lsrpackages: cryptsetup kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Saturday 31 August 2024 18:37:43 -0400 (0:00:01.098) 0:02:11.563 ******* ok: [managed_node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": 
"console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, 
"network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", 
"state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": 
"systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": 
"unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Saturday 31 August 2024 18:37:44 -0400 (0:00:01.206) 0:02:12.769 ******* ok: [managed_node2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Saturday 31 August 2024 18:37:45 -0400 (0:00:00.080) 0:02:12.850 ******* TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Saturday 31 August 2024 18:37:45 -0400 (0:00:00.108) 0:02:12.958 ******* changed: [managed_node2] => { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-6dc95608-1152-4f68-9e9a-bf366f9ae1bd", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-6dc95608-1152-4f68-9e9a-bf366f9ae1bd", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-6dc95608-1152-4f68-9e9a-bf366f9ae1bd", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-6dc95608-1152-4f68-9e9a-bf366f9ae1bd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=69b339e4-2a2d-4e01-95fb-af566cf1496f", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-6dc95608-1152-4f68-9e9a-bf366f9ae1bd", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-6dc95608-1152-4f68-9e9a-bf366f9ae1bd", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-6dc95608-1152-4f68-9e9a-bf366f9ae1bd", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", 
"vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Saturday 31 August 2024 18:37:55 -0400 (0:00:10.633) 0:02:23.591 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Saturday 31 August 2024 18:37:55 -0400 (0:00:00.058) 0:02:23.650 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1725143828.5552282, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "306f86ad13586cf865e53249f1833d2f75110061", "ctime": 1725143828.5522282, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263550, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1725143828.5522282, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1299, "uid": 0, "version": "18446744071584693682", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Saturday 31 August 2024 18:37:56 -0400 (0:00:00.406) 0:02:24.057 ******* ok: [managed_node2] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Saturday 31 August 2024 18:37:56 -0400 (0:00:00.363) 0:02:24.420 ******* TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Saturday 31 August 2024 18:37:56 -0400 (0:00:00.029) 0:02:24.449 ******* ok: [managed_node2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-6dc95608-1152-4f68-9e9a-bf366f9ae1bd", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-6dc95608-1152-4f68-9e9a-bf366f9ae1bd", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-6dc95608-1152-4f68-9e9a-bf366f9ae1bd", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-6dc95608-1152-4f68-9e9a-bf366f9ae1bd" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=69b339e4-2a2d-4e01-95fb-af566cf1496f", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-6dc95608-1152-4f68-9e9a-bf366f9ae1bd", 
"state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-6dc95608-1152-4f68-9e9a-bf366f9ae1bd", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-6dc95608-1152-4f68-9e9a-bf366f9ae1bd", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Saturday 31 August 2024 18:37:56 -0400 (0:00:00.053) 0:02:24.503 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Saturday 31 August 2024 18:37:56 -0400 (0:00:00.063) 0:02:24.566 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-6dc95608-1152-4f68-9e9a-bf366f9ae1bd", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-6dc95608-1152-4f68-9e9a-bf366f9ae1bd", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Saturday 31 August 2024 18:37:56 -0400 (0:00:00.061) 0:02:24.628 ******* changed: [managed_node2] => (item={u'src': u'UUID=69b339e4-2a2d-4e01-95fb-af566cf1496f', u'state': 
u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=69b339e4-2a2d-4e01-95fb-af566cf1496f", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=69b339e4-2a2d-4e01-95fb-af566cf1496f" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Saturday 31 August 2024 18:37:57 -0400 (0:00:00.445) 0:02:25.074 ******* ok: [managed_node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Saturday 31 August 2024 18:37:57 -0400 (0:00:00.583) 0:02:25.657 ******* changed: [managed_node2] => (item={u'src': u'/dev/mapper/luks-6dc95608-1152-4f68-9e9a-bf366f9ae1bd', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-6dc95608-1152-4f68-9e9a-bf366f9ae1bd", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-6dc95608-1152-4f68-9e9a-bf366f9ae1bd" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Saturday 31 August 2024 18:37:58 -0400 (0:00:00.482) 0:02:26.139 ******* skipping: [managed_node2] => (item={u'src': u'/dev/mapper/luks-6dc95608-1152-4f68-9e9a-bf366f9ae1bd', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-6dc95608-1152-4f68-9e9a-bf366f9ae1bd", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Saturday 31 August 2024 18:37:58 -0400 (0:00:00.064) 0:02:26.204 ******* ok: [managed_node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Saturday 31 August 2024 18:37:58 -0400 (0:00:00.530) 0:02:26.735 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1725143832.6102996, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1725143829.967253, "dev": 51713, 
"device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 917511, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1725143829.966253, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "18446744071584696124", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Saturday 31 August 2024 18:37:59 -0400 (0:00:00.406) 0:02:27.142 ******* changed: [managed_node2] => (item={u'state': u'present', u'password': u'-', u'name': u'luks-6dc95608-1152-4f68-9e9a-bf366f9ae1bd', u'backing_device': u'/dev/sda'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-6dc95608-1152-4f68-9e9a-bf366f9ae1bd", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Saturday 31 August 2024 18:37:59 -0400 (0:00:00.602) 0:02:27.744 ******* ok: [managed_node2] TASK [Verify role results] ***************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:171 Saturday 31 August 2024 18:38:00 -0400 (0:00:01.020) 0:02:28.765 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed_node2 TASK [Print out pool information] ********************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Saturday 31 August 2024 18:38:01 -0400 (0:00:00.124) 0:02:28.889 ******* skipping: [managed_node2] => {} TASK [Print out volume information] ******************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Saturday 31 August 2024 18:38:01 -0400 (0:00:00.051) 0:02:28.941 ******* ok: [managed_node2] => { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-6dc95608-1152-4f68-9e9a-bf366f9ae1bd", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-6dc95608-1152-4f68-9e9a-bf366f9ae1bd", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": 
null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Saturday 31 August 2024 18:38:01 -0400 (0:00:00.115) 0:02:29.056 ******* ok: [managed_node2] => { "changed": false, "info": { "/dev/mapper/luks-6dc95608-1152-4f68-9e9a-bf366f9ae1bd": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-6dc95608-1152-4f68-9e9a-bf366f9ae1bd", "size": "10G", "type": "crypt", "uuid": "a3cccafb-ee00-4d43-9da1-c9fadafd4bb9" }, "/dev/sda": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "6dc95608-1152-4f68-9e9a-bf366f9ae1bd" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Saturday 31 August 2024 18:38:01 -0400 (0:00:00.398) 0:02:29.455 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003652", "end": "2024-08-31 18:38:01.996521", "rc": 0, "start": "2024-08-31 18:38:01.992869" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Thu Jun 20 10:23:46 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs 
defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-6dc95608-1152-4f68-9e9a-bf366f9ae1bd /opt/test1 xfs defaults 0 0
TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Saturday 31 August 2024 18:38:02 -0400 (0:00:00.446) 0:02:29.902 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003287", "end": "2024-08-31 18:38:02.395544", "failed_when_result": false, "rc": 0, "start": "2024-08-31 18:38:02.392257" }
STDOUT: luks-6dc95608-1152-4f68-9e9a-bf366f9ae1bd /dev/sda -
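These two persisted lines are the whole boot-time contract for the encrypted volume. The fstab line "/dev/mapper/luks-6dc95608-1152-4f68-9e9a-bf366f9ae1bd /opt/test1 xfs defaults 0 0" names, in order, the opened mapping, the mount point, the filesystem type, the mount options, and the dump/fsck-pass fields. The crypttab entry follows the crypttab(5) layout:

    # /etc/crypttab: <name> <backing device> <key file> [<options>]
    luks-6dc95608-1152-4f68-9e9a-bf366f9ae1bd /dev/sda -

The '-' in the key-file field (the same "password": "-" seen in the crypts entry earlier) means no key file is stored, so systemd-cryptsetup prompts for the passphrase at boot before the fstab mount of /opt/test1 can proceed.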
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed_node2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Saturday 31 August 2024 18:38:02 -0400 (0:00:00.247) 0:02:30.765 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-6dc95608-1152-4f68-9e9a-bf366f9ae1bd" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Saturday 31 August 2024 18:38:03 -0400 (0:00:00.058) 0:02:30.824 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Saturday 31 August 2024 18:38:03 -0400 (0:00:00.066) 0:02:30.890 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Saturday 31 August 2024 18:38:03 -0400 (0:00:00.054) 0:02:30.944 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Saturday 31 August 2024 18:38:03 -0400 (0:00:00.061) 0:02:31.006 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Saturday 31 August 2024 18:38:03 -0400 (0:00:00.052) 0:02:31.058 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Saturday 31 August 2024 18:38:03 -0400 (0:00:00.050) 0:02:31.109 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Saturday 31 August 2024 18:38:03 -0400 (0:00:00.052) 0:02:31.161 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Saturday 31 August 2024 18:38:03 -0400 (0:00:00.058) 0:02:31.219 ******* skipping: [managed_node2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Saturday 31 August 2024 18:38:03 -0400 (0:00:00.054) 0:02:31.274 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Saturday 31 August 2024 18:38:03 -0400 (0:00:00.053) 0:02:31.328 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Saturday 31 August 2024 18:38:03 -0400 (0:00:00.064) 0:02:31.392 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-6dc95608-1152-4f68-9e9a-bf366f9ae1bd " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Saturday 31 August 2024 18:38:03 -0400 (0:00:00.088) 0:02:31.480 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Saturday 31 August 2024 18:38:03 -0400 (0:00:00.068) 0:02:31.549 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Saturday 31 August 2024 18:38:03 -0400 (0:00:00.067) 0:02:31.616 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Saturday 31 August 2024 18:38:03 -0400 (0:00:00.053) 0:02:31.670 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:51 Saturday 31 August 2024 18:38:03 -0400 (0:00:00.065) 0:02:31.735 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, 
"storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Saturday 31 August 2024 18:38:03 -0400 (0:00:00.052) 0:02:31.787 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Saturday 31 August 2024 18:38:04 -0400 (0:00:00.066) 0:02:31.854 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Saturday 31 August 2024 18:38:04 -0400 (0:00:00.072) 0:02:31.926 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1725143875.4910536, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1725143875.4910536, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 27840, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1725143875.4910536, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Saturday 31 August 2024 18:38:04 -0400 (0:00:00.395) 0:02:32.322 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Saturday 31 August 2024 18:38:04 -0400 (0:00:00.061) 0:02:32.383 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Saturday 31 August 2024 18:38:04 -0400 (0:00:00.050) 0:02:32.434 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Saturday 31 August 2024 18:38:04 -0400 (0:00:00.062) 0:02:32.497 ******* ok: [managed_node2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: 
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Saturday 31 August 2024 18:38:04 -0400 (0:00:00.102) 0:02:32.600 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Saturday 31 August 2024 18:38:04 -0400 (0:00:00.086) 0:02:32.686 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Saturday 31 August 2024 18:38:04 -0400 (0:00:00.072) 0:02:32.759 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1725143875.6420562, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1725143875.6420562, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 186050, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1725143875.6420562, "nlink": 1, "path": "/dev/mapper/luks-6dc95608-1152-4f68-9e9a-bf366f9ae1bd", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Saturday 31 August 2024 18:38:05 -0400 (0:00:00.501) 0:02:33.261 ******* ok: [managed_node2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Saturday 31 August 2024 18:38:06 -0400 (0:00:00.994) 0:02:34.255 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda" ], "delta": "0:00:00.027706", "end": "2024-08-31 18:38:06.828668", "rc": 0, "start": "2024-08-31 18:38:06.800962" } STDOUT: LUKS header information for /dev/sda Version: 1 Cipher name: aes Cipher mode: xts-plain64 Hash spec: sha256 Payload offset: 8192 MK bits: 512 MK digest: 55 8c 90 75 e3 97 40 d0 6a da b4 b8 ba 38 ec 22 75 d4 3f e1 MK salt: f8 06 05 8e 98 62 09 5c 09 b4 49 d9 2a dd 7c 2e fd 2c a5 fe 4c ed 47 cd 4f c1 0e 7e 13 dc 55 45 MK iterations: 20634 UUID: 6dc95608-1152-4f68-9e9a-bf366f9ae1bd Key Slot 0: ENABLED Iterations: 329326 Salt: 24 69 34 bd 26 16 3d 60 4b 9b b6 8a 50 8b 92 66 4c 05 4d 63 51 2a 2c ae 36 a1 01 4e bf 79 50 d4 Key material offset: 8 AF stripes: 4000 Key Slot 1: DISABLED Key Slot 2: DISABLED Key Slot 3: DISABLED Key Slot 4: DISABLED Key Slot 5: DISABLED Key Slot 6: DISABLED Key Slot 7: DISABLED TASK [Verify the presence/absence of the LUKS device node] ********************* task path: 
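
The luksDump header above (LUKS version 1, aes in xts-plain64 mode, a 512-bit master key, only key slot 0 enabled) is the raw material for the "Check LUKS version/key size/cipher" tasks that follow; they are skipped in this run because the test does not pin those settings. A minimal sketch of how such fields could be pulled out of the dump, assuming a hypothetical register name:

    - name: Dump the LUKS header (as in the task above)
      ansible.builtin.command:
        cmd: cryptsetup luksDump /dev/sda
      register: storage_test_luks_dump   # hypothetical name
      changed_when: false

    - name: Extract version and cipher from the header text
      ansible.builtin.set_fact:
        storage_test_luks_version: "{{ storage_test_luks_dump.stdout | regex_search('Version:\\s+(\\d+)', '\\1') | first }}"
        storage_test_luks_cipher: "{{ storage_test_luks_dump.stdout | regex_search('Cipher name:\\s+(\\S+)', '\\1') | first }}"
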
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Saturday 31 August 2024 18:38:06 -0400 (0:00:00.511) 0:02:34.766 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Saturday 31 August 2024 18:38:07 -0400 (0:00:00.067) 0:02:34.834 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Saturday 31 August 2024 18:38:07 -0400 (0:00:00.091) 0:02:34.925 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Saturday 31 August 2024 18:38:07 -0400 (0:00:00.065) 0:02:34.991 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Saturday 31 August 2024 18:38:07 -0400 (0:00:00.132) 0:02:35.124 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Saturday 31 August 2024 18:38:07 -0400 (0:00:00.080) 0:02:35.204 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Saturday 31 August 2024 18:38:07 -0400 (0:00:00.106) 0:02:35.310 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Saturday 31 August 2024 18:38:07 -0400 (0:00:00.078) 0:02:35.389 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-6dc95608-1152-4f68-9e9a-bf366f9ae1bd /dev/sda -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Saturday 31 August 2024 18:38:07 -0400 (0:00:00.075) 0:02:35.465 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: 
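
The crypttab checks in this block reduce to field-wise assertions on the single expected entry: the mapping name, the backing device, and the key file column. A hedged sketch of those assertions, reusing the fact names that appear in this log:

    - name: Validate the crypttab entry fields (illustrative)
      ansible.builtin.assert:
        that:
          - _storage_test_crypttab_entries | length == (_storage_test_expected_crypttab_entries | int)
          - _storage_test_crypttab_entries[0].split()[1] == '/dev/sda'
          - _storage_test_crypttab_entries[0].split()[2] == _storage_test_expected_crypttab_key_file
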
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Saturday 31 August 2024 18:38:07 -0400 (0:00:00.060) 0:02:35.525 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Saturday 31 August 2024 18:38:07 -0400 (0:00:00.055) 0:02:35.581 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Saturday 31 August 2024 18:38:07 -0400 (0:00:00.042) 0:02:35.624 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Saturday 31 August 2024 18:38:07 -0400 (0:00:00.042) 0:02:35.666 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Saturday 31 August 2024 18:38:07 -0400 (0:00:00.035) 0:02:35.702 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Saturday 31 August 2024 18:38:07 -0400 (0:00:00.040) 0:02:35.743 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Saturday 31 August 2024 18:38:07 -0400 (0:00:00.048) 0:02:35.791 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Saturday 31 August 2024 18:38:08 -0400 (0:00:00.049) 0:02:35.841 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Saturday 31 August 2024 18:38:08 -0400 (0:00:00.047) 0:02:35.888 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Saturday 31 August 2024 18:38:08 
-0400 (0:00:00.051) 0:02:35.940 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Saturday 31 August 2024 18:38:08 -0400 (0:00:00.071) 0:02:36.011 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Saturday 31 August 2024 18:38:08 -0400 (0:00:00.052) 0:02:36.064 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Saturday 31 August 2024 18:38:08 -0400 (0:00:00.055) 0:02:36.120 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Saturday 31 August 2024 18:38:08 -0400 (0:00:00.058) 0:02:36.178 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Saturday 31 August 2024 18:38:08 -0400 (0:00:00.080) 0:02:36.258 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Saturday 31 August 2024 18:38:08 -0400 (0:00:00.070) 0:02:36.328 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Saturday 31 August 2024 18:38:08 -0400 (0:00:00.054) 0:02:36.383 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Saturday 31 August 2024 18:38:08 -0400 (0:00:00.068) 0:02:36.451 ******* ok: [managed_node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Saturday 31 August 2024 18:38:08 -0400 (0:00:00.082) 0:02:36.533 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] 
********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Saturday 31 August 2024 18:38:08 -0400 (0:00:00.059) 0:02:36.593 ******* skipping: [managed_node2] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Saturday 31 August 2024 18:38:08 -0400 (0:00:00.053) 0:02:36.647 ******* skipping: [managed_node2] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Saturday 31 August 2024 18:38:08 -0400 (0:00:00.052) 0:02:36.699 ******* skipping: [managed_node2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Saturday 31 August 2024 18:38:08 -0400 (0:00:00.058) 0:02:36.757 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Saturday 31 August 2024 18:38:08 -0400 (0:00:00.051) 0:02:36.809 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Saturday 31 August 2024 18:38:09 -0400 (0:00:00.047) 0:02:36.857 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Saturday 31 August 2024 18:38:09 -0400 (0:00:00.050) 0:02:36.907 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Saturday 31 August 2024 18:38:09 -0400 (0:00:00.048) 0:02:36.956 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Saturday 31 August 2024 18:38:09 -0400 (0:00:00.049) 0:02:37.005 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Saturday 31 August 2024 18:38:09 -0400 (0:00:00.048) 0:02:37.054 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable 
thin pool space from int to Size] ***************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Saturday 31 August 2024 18:38:09 -0400 (0:00:00.048) 0:02:37.102 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Saturday 31 August 2024 18:38:09 -0400 (0:00:00.047) 0:02:37.150 ******* skipping: [managed_node2] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Saturday 31 August 2024 18:38:09 -0400 (0:00:00.053) 0:02:37.203 ******* skipping: [managed_node2] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Saturday 31 August 2024 18:38:09 -0400 (0:00:00.051) 0:02:37.254 ******* skipping: [managed_node2] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Saturday 31 August 2024 18:38:09 -0400 (0:00:00.082) 0:02:37.337 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Saturday 31 August 2024 18:38:09 -0400 (0:00:00.078) 0:02:37.416 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Saturday 31 August 2024 18:38:09 -0400 (0:00:00.069) 0:02:37.485 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Saturday 31 August 2024 18:38:09 -0400 (0:00:00.079) 0:02:37.565 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Saturday 31 August 2024 18:38:09 -0400 (0:00:00.088) 0:02:37.653 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Saturday 31 August 2024 18:38:09 -0400 (0:00:00.076) 0:02:37.730 ******* ok: [managed_node2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", 
"skipped": true } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Saturday 31 August 2024 18:38:09 -0400 (0:00:00.066) 0:02:37.796 ******* ok: [managed_node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Saturday 31 August 2024 18:38:10 -0400 (0:00:00.051) 0:02:37.848 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Saturday 31 August 2024 18:38:10 -0400 (0:00:00.056) 0:02:37.904 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Saturday 31 August 2024 18:38:10 -0400 (0:00:00.045) 0:02:37.950 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Saturday 31 August 2024 18:38:10 -0400 (0:00:00.043) 0:02:37.993 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Saturday 31 August 2024 18:38:10 -0400 (0:00:00.035) 0:02:38.029 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Saturday 31 August 2024 18:38:10 -0400 (0:00:00.041) 0:02:38.070 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Saturday 31 August 2024 18:38:10 -0400 (0:00:00.054) 0:02:38.125 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Saturday 31 August 2024 18:38:10 -0400 (0:00:00.048) 0:02:38.174 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: 
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Saturday 31 August 2024 18:38:10 -0400 (0:00:00.061) 0:02:38.235 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Saturday 31 August 2024 18:38:10 -0400 (0:00:00.058) 0:02:38.293 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Test for correct handling of new encrypted volume w/ no key] ************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:178 Saturday 31 August 2024 18:38:10 -0400 (0:00:00.053) 0:02:38.346 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed_node2 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Saturday 31 August 2024 18:38:10 -0400 (0:00:00.102) 0:02:38.449 ******* ok: [managed_node2] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Saturday 31 August 2024 18:38:10 -0400 (0:00:00.045) 0:02:38.494 ******* TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 31 August 2024 18:38:10 -0400 (0:00:00.075) 0:02:38.570 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 31 August 2024 18:38:10 -0400 (0:00:00.103) 0:02:38.673 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 31 August 2024 18:38:10 -0400 (0:00:00.088) 0:02:38.762 ******* skipping: [managed_node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed_node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed_node2] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 
's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed_node2] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 31 August 2024 18:38:11 -0400 (0:00:00.156) 0:02:38.918 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 31 August 2024 18:38:11 -0400 (0:00:00.112) 0:02:39.031 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 31 August 2024 18:38:11 -0400 (0:00:00.051) 0:02:39.083 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 31 August 2024 18:38:11 -0400 (0:00:00.059) 0:02:39.142 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 31 August 2024 18:38:11 -0400 (0:00:00.089) 0:02:39.232 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 31 August 2024 18:38:11 -0400 (0:00:00.210) 0:02:39.442 ******* ok: [managed_node2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34 TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: 
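
The CentOS_7.yml include above is how the role picks its platform package set: a per-distribution vars file whose last list entry resolves an architecture conditional at include time. Reassembled from the values shown in this log, the file looks roughly like this (abridged sketch, not the verbatim file):

    # roles/storage/vars/CentOS_7.yml (sketch)
    blivet_package_list:
      - python-enum34
      - python-blivet3
      - libblockdev-crypto
      - libblockdev-dm
      - libblockdev-lvm
      - libblockdev-mdraid
      - libblockdev-swap
      - "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}"
    __storage_blivet_diskvolume_mkfs_option_map:
      ext2: '-F'
      ext3: '-F'
      ext4: '-F'

The storage_pools value shown next feeds the negative test: an encrypted volume with no encryption key while storage_safe_mode is true, so the role is expected to fail with an error rather than create the device.
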
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 31 August 2024 18:38:13 -0400 (0:00:01.525) 0:02:40.968 ******* ok: [managed_node2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 31 August 2024 18:38:13 -0400 (0:00:00.121) 0:02:41.090 ******* ok: [managed_node2] => { "storage_volumes": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 31 August 2024 18:38:13 -0400 (0:00:00.069) 0:02:41.160 ******* ok: [managed_node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Saturday 31 August 2024 18:38:17 -0400 (0:00:04.211) 0:02:45.372 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 31 August 2024 18:38:17 -0400 (0:00:00.102) 0:02:45.475 ******* TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 31 August 2024 18:38:17 -0400 (0:00:00.051) 0:02:45.526 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 31 August 2024 18:38:17 -0400 (0:00:00.050) 0:02:45.577 ******* TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 31 August 2024 18:38:17 -0400 (0:00:00.048) 0:02:45.626 ******* ok: [managed_node2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } lsrpackages: cryptsetup kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Saturday 31 August 2024 18:38:18 -0400 (0:00:00.757) 0:02:46.383 ******* ok: [managed_node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": 
"NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", 
"status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", 
"status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { 
"name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Saturday 31 August 2024 18:38:19 -0400 (0:00:01.038) 0:02:47.421 ******* ok: [managed_node2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Saturday 31 August 2024 18:38:19 -0400 (0:00:00.059) 0:02:47.481 ******* TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Saturday 31 August 2024 18:38:19 -0400 (0:00:00.058) 0:02:47.540 ******* fatal: [managed_node2]: FAILED! 
=> { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'test1' missing key/password TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:109 Saturday 31 August 2024 18:38:23 -0400 (0:00:04.235) 0:02:51.775 ******* fatal: [managed_node2]: FAILED! => { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'diskvolume_mkfs_option_map': {u'ext4': u'-F', u'ext3': u'-F', u'ext2': u'-F'}, u'safe_mode': False, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'encryption_tang_url': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_tang_thumbprint': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'encryption': True, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'mount_passno': 0, u'mount_user': None, u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'mount_group': None, u'fs_overwrite_existing': True, u'disks': [], u'cached': False, u'thin_pool_size': None, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'shared': False, u'encryption_clevis_pin': None, u'type': u'partition', u'encryption_cipher': None, u'raid_chunk_size': None}], u'volumes': [], u'pool_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_password': None, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'shared': False, u'type': u'lvm', u'encryption_key_size': None, u'raid_chunk_size': None}, u'volume_defaults': {u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 0, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'cache_size': 0, u'raid_spare_count': None, u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'thin_pool_size': None, u'thin': None, u'mount_check': 
0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"encrypted volume 'test1' missing key/password"}
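
The fatal result above is the expected outcome of this negative test: the volume spec sets `encryption: true` while both `encryption_key` and `encryption_password` are null (visible in the module args), so blivet has nothing to format the LUKS layer with and the role aborts before touching the disk (`actions: []`). As a rough illustration, the failing spec corresponds to an invocation shaped like the following sketch; the play wrapper is assumed, not taken from the test source:

```yaml
# Illustrative reconstruction of the failing spec from the module args above;
# the surrounding play is an assumption, not the actual tests_luks.yml text.
- hosts: managed_node2
  tasks:
    - name: Encrypted volume without a key or password (fails by design)
      include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_pools:
          - name: foo
            type: partition
            disks: [sda]
            volumes:
              - name: test1
                type: partition
                size: 4g
                mount_point: /opt/test1
                encryption: true
                # no encryption_password / encryption_key here, hence
                # "encrypted volume 'test1' missing key/password"
```

Supplying `encryption_password` (as the next, successful invocation below does) or `encryption_key` is what resolves the error.
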
TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113
Saturday 31 August 2024 18:38:24 -0400 (0:00:00.098) 0:02:51.874 *******

TASK [Check that we failed in the role] ****************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23
Saturday 31 August 2024 18:38:24 -0400 (0:00:00.067) 0:02:51.941 *******
ok: [managed_node2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the blivet output and error message are correct] ******************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28
Saturday 31 August 2024 18:38:24 -0400 (0:00:00.056) 0:02:51.998 *******
ok: [managed_node2] => { "changed": false }
MSG: All assertions passed

TASK [Verify correct exception or error message] *******************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39
Saturday 31 August 2024 18:38:24 -0400 (0:00:00.094) 0:02:52.093 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Create an encrypted partition volume w/ default fs] **********************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:197
Saturday 31 August 2024 18:38:24 -0400 (0:00:00.073) 0:02:52.166 *******

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Saturday 31 August 2024 18:38:24 -0400 (0:00:00.298) 0:02:52.465 *******
included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed_node2

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Saturday 31 August 2024 18:38:24 -0400 (0:00:00.103) 0:02:52.569 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Saturday 31 August 2024 18:38:24 -0400 (0:00:00.066) 0:02:52.635 *******
skipping: [managed_node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed_node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed_node2] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" }
skipping: [managed_node2] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Saturday 31 August 2024 18:38:24 -0400 (0:00:00.171) 0:02:52.807 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Saturday 31 August 2024 18:38:25 -0400 (0:00:00.088) 0:02:52.895 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Saturday 31 August 2024 18:38:25 -0400 (0:00:00.055) 0:02:52.951 *******
ok: [managed_node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Saturday 31 August 2024 18:38:25 -0400 (0:00:00.085) 0:02:53.036 *******
ok: [managed_node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Saturday 31 August 2024 18:38:25 -0400 (0:00:00.072) 0:02:53.109 *******
included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed_node2

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
Saturday 31 August 2024 18:38:25 -0400 (0:00:00.177) 0:02:53.286 *******
ok: [managed_node2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] }
lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Saturday 31 August 2024 18:38:27 -0400 (0:00:01.609) 0:02:54.895 *******
ok: [managed_node2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] }
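
The spec now carries `encryption_password: "yabbadabbadoo"` in the clear, which is fine for a throwaway CI disk; the module itself masks the value as `VALUE_SPECIFIED_IN_NO_LOG_PARAMETER` in its own output further down. Outside a test, the password would normally come from an encrypted source rather than the playbook text; a hypothetical sketch using an Ansible Vault variable (the name `vault_luks_password` is assumed):

```yaml
# Hypothetical hardening of the same pool spec: the literal password is
# replaced by a vaulted variable (vault_luks_password is an assumed name,
# e.g. defined in a vars file encrypted with ansible-vault).
storage_pools:
  - name: foo
    type: partition
    disks: [sda]
    volumes:
      - name: test1
        type: partition
        size: 4g
        mount_point: /opt/test1
        encryption: true
        encryption_password: "{{ vault_luks_password }}"
```
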
TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14
Saturday 31 August 2024 18:38:27 -0400 (0:00:00.062) 0:02:54.957 *******
ok: [managed_node2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
Saturday 31 August 2024 18:38:27 -0400 (0:00:00.078) 0:02:55.036 *******
ok: [managed_node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] }

TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31
Saturday 31 August 2024 18:38:31 -0400 (0:00:04.429) 0:02:59.466 *******
included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed_node2

TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
Saturday 31 August 2024 18:38:31 -0400 (0:00:00.093) 0:02:59.559 *******

TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13
Saturday 31 August 2024 18:38:31 -0400 (0:00:00.050) 0:02:59.609 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19
Saturday 31 August 2024 18:38:31 -0400 (0:00:00.071) 0:02:59.681 *******

TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37
Saturday 31 August 2024 18:38:31 -0400 (0:00:00.106) 0:02:59.787 *******
ok: [managed_node2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] }
lsrpackages: cryptsetup kpartx

TASK [fedora.linux_system_roles.storage : Get service facts] *******************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51
Saturday 31 August 2024 18:38:33 -0400 (0:00:01.107) 0:03:00.895 *******
ok: [managed_node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source":
"systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, 
"dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": 
"enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": 
"systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Saturday 31 August 2024 18:38:34 -0400 (0:00:01.245) 0:03:02.141 ******* ok: [managed_node2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Saturday 31 August 2024 18:38:34 -0400 (0:00:00.077) 0:03:02.218 ******* TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Saturday 31 August 2024 18:38:34 -0400 (0:00:00.048) 0:03:02.266 ******* changed: [managed_node2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-6dc95608-1152-4f68-9e9a-bf366f9ae1bd", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-6dc95608-1152-4f68-9e9a-bf366f9ae1bd", 
"fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-6dc95608-1152-4f68-9e9a-bf366f9ae1bd", "password": "-", "state": "absent" }, { "backing_device": "/dev/sda1", "name": "luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-6dc95608-1152-4f68-9e9a-bf366f9ae1bd", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Saturday 31 August 2024 18:38:45 -0400 (0:00:11.215) 0:03:13.482 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ******
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90
Saturday 31 August 2024 18:38:45 -0400 (0:00:00.098) 0:03:13.581 *******
ok: [managed_node2] => { "changed": false, "stat": { "atime": 1725143878.2171016, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "86110fff99dba498ba8a2a123f8cff0fd7e51439", "ctime": 1725143878.2131016, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263550, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1725143878.2131016, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1311, "uid": 0, "version": "18446744071584693682", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95
Saturday 31 August 2024 18:38:46 -0400 (0:00:00.592) 0:03:14.173 *******
ok: [managed_node2] => { "backup": "", "changed": false }

TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113
Saturday 31 August 2024 18:38:46 -0400 (0:00:00.477) 0:03:14.650 *******

TASK [fedora.linux_system_roles.storage : Show blivet_output] ******************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119
Saturday 31 August 2024 18:38:46 -0400 (0:00:00.078) 0:03:14.728 *******
ok: [managed_node2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-6dc95608-1152-4f68-9e9a-bf366f9ae1bd", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-6dc95608-1152-4f68-9e9a-bf366f9ae1bd", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-6dc95608-1152-4f68-9e9a-bf366f9ae1bd", "password": "-", "state": "absent" }, { "backing_device": "/dev/sda1", "name": "luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-6dc95608-1152-4f68-9e9a-bf366f9ae1bd", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults",
"owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Saturday 31 August 2024 18:38:47 -0400 (0:00:00.140) 0:03:14.869 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": 
"defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Saturday 31 August 2024 18:38:47 -0400 (0:00:00.110) 0:03:14.979 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Saturday 31 August 2024 18:38:47 -0400 (0:00:00.076) 0:03:15.056 ******* changed: [managed_node2] => (item={u'src': u'/dev/mapper/luks-6dc95608-1152-4f68-9e9a-bf366f9ae1bd', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-6dc95608-1152-4f68-9e9a-bf366f9ae1bd", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-6dc95608-1152-4f68-9e9a-bf366f9ae1bd" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Saturday 31 August 2024 18:38:47 -0400 (0:00:00.723) 0:03:15.779 ******* ok: [managed_node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Saturday 31 August 2024 18:38:48 -0400 (0:00:00.836) 0:03:16.616 ******* changed: [managed_node2] => (item={u'src': u'/dev/mapper/luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Saturday 31 August 2024 18:38:49 -0400 (0:00:00.656) 0:03:17.272 ******* skipping: [managed_node2] => (item={u'src': u'/dev/mapper/luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", 
"changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Saturday 31 August 2024 18:38:49 -0400 (0:00:00.073) 0:03:17.346 ******* ok: [managed_node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Saturday 31 August 2024 18:38:50 -0400 (0:00:00.725) 0:03:18.071 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1725143882.394175, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "f7639b2214d2a2fb28333d53fed9f3e8cad3e443", "ctime": 1725143879.8211298, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 917510, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1725143879.8211298, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 53, "uid": 0, "version": "18446744071584696285", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Saturday 31 August 2024 18:38:50 -0400 (0:00:00.455) 0:03:18.527 ******* changed: [managed_node2] => (item={u'state': u'absent', u'password': u'-', u'name': u'luks-6dc95608-1152-4f68-9e9a-bf366f9ae1bd', u'backing_device': u'/dev/sda'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-6dc95608-1152-4f68-9e9a-bf366f9ae1bd", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed changed: [managed_node2] => (item={u'state': u'present', u'password': u'-', u'name': u'luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8', u'backing_device': u'/dev/sda1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Saturday 31 August 2024 18:38:51 -0400 (0:00:00.831) 0:03:19.358 ******* ok: [managed_node2] TASK [Verify role results] ***************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:214 Saturday 31 August 2024 18:38:52 -0400 (0:00:00.981) 0:03:20.340 ******* included: 
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed_node2 TASK [Print out pool information] ********************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Saturday 31 August 2024 18:38:52 -0400 (0:00:00.241) 0:03:20.582 ******* ok: [managed_node2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Saturday 31 August 2024 18:38:52 -0400 (0:00:00.086) 0:03:20.668 ******* skipping: [managed_node2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Saturday 31 August 2024 18:38:52 -0400 (0:00:00.098) 0:03:20.767 ******* ok: [managed_node2] => { "changed": false, "info": { "/dev/mapper/luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8", "size": "10G", "type": "crypt", "uuid": "d0121e05-120d-4efd-842d-2018fdb19643" }, "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Saturday 31 August 2024 18:38:53 -0400 (0:00:00.666) 0:03:21.434 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003296", "end": "2024-08-31 18:38:53.976313", "rc": 0, "start": "2024-08-31 18:38:53.973017" } STDOUT:
# system_role:storage
#
# /etc/fstab
# Created by anaconda on Thu Jun 20 10:23:46 2024
#
# Accessible filesystems, by reference, are maintained under '/dev/disk'
# See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info
#
UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
/dev/mapper/luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8 /opt/test1 xfs defaults 0 0
TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Saturday 31 August 2024 18:38:54 -0400 (0:00:00.449) 0:03:21.883 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003296", "end": "2024-08-31 18:38:54.386526", "failed_when_result": false, "rc": 0, "start": "2024-08-31 18:38:54.383230" } STDOUT:
luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8 /dev/sda1 -
TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Saturday 31 August 2024 18:38:54 -0400 (0:00:00.402) 0:03:22.285 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed_node2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Saturday 31 August 2024 18:38:54 -0400 (0:00:00.135) 0:03:22.421 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Saturday 31 August 2024 18:38:54 -0400 (0:00:00.104) 0:03:22.525 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Saturday 31 August 2024 18:38:54 -0400 (0:00:00.106) 0:03:22.632 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Saturday 31 August 2024 18:38:54 -0400 (0:00:00.119) 0:03:22.752 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed_node2 included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed_node2 TASK [Set test variables] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Saturday 31 August 2024 18:38:55 -0400 (0:00:00.194) 0:03:22.947 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path:
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Saturday 31 August 2024 18:38:55 -0400 (0:00:00.123) 0:03:23.071 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Saturday 31 August 2024 18:38:55 -0400 (0:00:00.077) 0:03:23.149 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Saturday 31 August 2024 18:38:55 -0400 (0:00:00.068) 0:03:23.217 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Saturday 31 August 2024 18:38:55 -0400 (0:00:00.053) 0:03:23.271 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Saturday 31 August 2024 18:38:55 -0400 (0:00:00.063) 0:03:23.334 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Saturday 31 August 2024 18:38:55 -0400 (0:00:00.049) 0:03:23.383 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Saturday 31 August 2024 18:38:55 -0400 (0:00:00.054) 0:03:23.438 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:51 Saturday 31 August 2024 18:38:55 -0400 (0:00:00.051) 0:03:23.489 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:64 Saturday 31 August 2024 18:38:55 -0400 (0:00:00.053) 0:03:23.543 ******* ok: [managed_node2] => { "changed": false, "rc": 0 } STDOUT: False STDERR: Shared connection to 10.31.12.24 closed. 
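The probe above prints False: the blivet build on this node does not report grow-to-fill support, so the PV-fill verification that follows produces no per-device results. For readers reproducing this kind of capability check outside the test suite, a minimal sketch follows; the task names, the registered variable, and the hasattr target are illustrative assumptions, not the suite's actual script:

  - name: Probe whether the installed blivet supports grow-to-fill (illustrative sketch)
    command: >-
      {{ ansible_python.executable | default('python') }} -c
      'import blivet.devices; print(hasattr(blivet.devices.PartitionDevice, "grow_to_fill"))'
    register: storage_test_grow_probe   # hypothetical variable name
    changed_when: false                 # a read-only probe never changes state

  - name: Report the probe result
    debug:
      msg: "blivet grow-to-fill supported: {{ storage_test_grow_probe.stdout | trim }}"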
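Stepping back from the verification flow: the "Remove obsolete mounts", "Set up new/current mounts", and "Manage /etc/crypttab" tasks earlier in this play reduce to keeping one fstab line and one crypttab line in sync with the active LUKS mapping, followed by a systemd daemon-reload so that systemd regenerates its mount units from /etc/fstab. A hand-rolled equivalent using the stock crypttab and mount modules would look roughly like this (a sketch reusing the device names from this run, not the role's actual implementation):

  - name: Keep the crypttab entry for the LUKS mapping (sketch)
    crypttab:
      name: luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8
      backing_device: /dev/sda1
      password: '-'        # '-' means no key file; the passphrase is supplied another way
      state: present

  - name: Keep the fstab entry and mount the decrypted device (sketch)
    mount:
      path: /opt/test1
      src: /dev/mapper/luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8
      fstype: xfs
      opts: defaults
      state: mounted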
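Further down, the suite inspects the LUKS header directly with cryptsetup luksDump (see the "Collect LUKS info for this volume" task below). The same inspection can be scripted and asserted on; a sketch using the device and UUID from this run (the registered variable name is illustrative):

  - name: Dump the LUKS header for inspection (sketch)
    command: cryptsetup luksDump /dev/sda1
    register: storage_test_luks_dump   # hypothetical variable name
    changed_when: false

  - name: Assert that the header carries the expected UUID
    assert:
      that:
        - "'93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8' in storage_test_luks_dump.stdout"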
TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:73 Saturday 31 August 2024 18:38:56 -0400 (0:00:00.518) 0:03:24.061 ******* TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:83 Saturday 31 August 2024 18:38:56 -0400 (0:00:00.087) 0:03:24.149 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed_node2 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Saturday 31 August 2024 18:38:56 -0400 (0:00:00.129) 0:03:24.278 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Saturday 31 August 2024 18:38:56 -0400 (0:00:00.051) 0:03:24.330 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Saturday 31 August 2024 18:38:56 -0400 (0:00:00.072) 0:03:24.403 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Saturday 31 August 2024 18:38:56 -0400 (0:00:00.067) 0:03:24.471 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Saturday 31 August 2024 18:38:56 -0400 (0:00:00.054) 0:03:24.525 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Saturday 31 August 2024 18:38:56 -0400 (0:00:00.050) 0:03:24.575 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Saturday 31 August 2024 18:38:56 -0400 (0:00:00.049) 0:03:24.625 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Saturday 31 August 2024 18:38:56 -0400 (0:00:00.049) 0:03:24.675 ******* skipping: [managed_node2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Saturday 31 August 2024 18:38:56 -0400 (0:00:00.048) 0:03:24.724 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Saturday 31 August 2024 18:38:56 -0400 (0:00:00.049) 0:03:24.774 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Saturday 31 August 2024 18:38:57 -0400 (0:00:00.063) 0:03:24.837 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:86 Saturday 31 August 2024 18:38:57 -0400 (0:00:00.078) 0:03:24.915 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed_node2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Saturday 31 August 2024 18:38:57 -0400 (0:00:00.108) 0:03:25.023 ******* skipping: [managed_node2] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'/dev/mapper/luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'_kernel_device': u'/dev/dm-0', u'encryption': True, u'raid_level': None, u'name': u'test1', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'mount_passno': 0, u'_mount_id': u'/dev/mapper/luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8', u'mount_user': None, u'raid_spare_count': None, u'raid_disks': [], u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'cache_devices': [], u'deduplication': None, u'mount_group': None, u'thin_pool_size': None, u'disks': [], u'cached': False, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'fs_overwrite_existing': True, u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/mapper/luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8", "_kernel_device": "/dev/dm-0", "_mount_id": 
"/dev/mapper/luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:89 Saturday 31 August 2024 18:38:57 -0400 (0:00:00.074) 0:03:25.098 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed_node2 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Saturday 31 August 2024 18:38:57 -0400 (0:00:00.109) 0:03:25.207 ******* skipping: [managed_node2] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'/dev/mapper/luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'_kernel_device': u'/dev/dm-0', u'encryption': True, u'raid_level': None, u'name': u'test1', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'mount_passno': 0, u'_mount_id': u'/dev/mapper/luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8', u'mount_user': None, u'raid_spare_count': None, u'raid_disks': [], u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'cache_devices': [], u'deduplication': None, u'mount_group': None, u'thin_pool_size': None, u'disks': [], u'cached': False, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'fs_overwrite_existing': True, u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/mapper/luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": 
true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:92 Saturday 31 August 2024 18:38:57 -0400 (0:00:00.080) 0:03:25.288 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed_node2 TASK [Set test variables] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Saturday 31 August 2024 18:38:57 -0400 (0:00:00.126) 0:03:25.414 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Saturday 31 August 2024 18:38:57 -0400 (0:00:00.059) 0:03:25.474 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Saturday 31 August 2024 18:38:57 -0400 (0:00:00.053) 0:03:25.527 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Saturday 31 August 2024 18:38:57 -0400 (0:00:00.052) 0:03:25.580 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:95 Saturday 31 August 2024 18:38:57 -0400 (0:00:00.051) 0:03:25.631 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed_node2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Saturday 31 August 2024 18:38:57 -0400 (0:00:00.183) 0:03:25.814 ******* 
skipping: [managed_node2] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'/dev/mapper/luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'_kernel_device': u'/dev/dm-0', u'encryption': True, u'raid_level': None, u'name': u'test1', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'mount_passno': 0, u'_mount_id': u'/dev/mapper/luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8', u'mount_user': None, u'raid_spare_count': None, u'raid_disks': [], u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'cache_devices': [], u'deduplication': None, u'mount_group': None, u'thin_pool_size': None, u'disks': [], u'cached': False, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'fs_overwrite_existing': True, u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/mapper/luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:98 Saturday 31 August 2024 18:38:58 -0400 (0:00:00.079) 0:03:25.894 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed_node2 TASK [Run 'stratis report'] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Saturday 31 August 2024 18:38:58 -0400 (0:00:00.159) 0:03:26.054 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about Stratis] ******************************************* task path: 
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:11 Saturday 31 August 2024 18:38:58 -0400 (0:00:00.056) 0:03:26.110 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Saturday 31 August 2024 18:38:58 -0400 (0:00:00.052) 0:03:26.162 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:25 Saturday 31 August 2024 18:38:58 -0400 (0:00:00.051) 0:03:26.214 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:34 Saturday 31 August 2024 18:38:58 -0400 (0:00:00.047) 0:03:26.261 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:44 Saturday 31 August 2024 18:38:58 -0400 (0:00:00.042) 0:03:26.304 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:101 Saturday 31 August 2024 18:38:58 -0400 (0:00:00.043) 0:03:26.347 ******* ok: [managed_node2] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Saturday 31 August 2024 18:38:58 -0400 (0:00:00.066) 0:03:26.414 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed_node2 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Saturday 31 August 2024 18:38:58 -0400 (0:00:00.121) 0:03:26.535 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Saturday 31 August 2024 18:38:58 -0400 (0:00:00.070) 0:03:26.606 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for 
managed_node2 included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed_node2 included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed_node2 included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed_node2 included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed_node2 included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed_node2 included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed_node2 included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed_node2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Saturday 31 August 2024 18:38:59 -0400 (0:00:00.351) 0:03:26.957 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Saturday 31 August 2024 18:38:59 -0400 (0:00:00.083) 0:03:27.041 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Saturday 31 August 2024 18:38:59 -0400 (0:00:00.071) 0:03:27.113 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Saturday 31 August 2024 18:38:59 -0400 (0:00:00.073) 0:03:27.186 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Saturday 31 August 2024 18:38:59 -0400 (0:00:00.062) 0:03:27.249 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Saturday 31 August 2024 18:38:59 -0400 (0:00:00.052) 0:03:27.301 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Saturday 31 August 2024 
18:38:59 -0400 (0:00:00.051) 0:03:27.352 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Saturday 31 August 2024 18:38:59 -0400 (0:00:00.070) 0:03:27.423 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Saturday 31 August 2024 18:38:59 -0400 (0:00:00.085) 0:03:27.509 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Saturday 31 August 2024 18:38:59 -0400 (0:00:00.050) 0:03:27.560 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Saturday 31 August 2024 18:38:59 -0400 (0:00:00.062) 0:03:27.622 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Saturday 31 August 2024 18:38:59 -0400 (0:00:00.084) 0:03:27.706 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Saturday 31 August 2024 18:39:00 -0400 (0:00:00.132) 0:03:27.838 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Saturday 31 August 2024 18:39:00 -0400 (0:00:00.142) 0:03:27.980 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Saturday 31 August 2024 18:39:00 -0400 (0:00:00.137) 0:03:28.118 ******* skipping: [managed_node2] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Saturday 31 August 2024 18:39:00 -0400 (0:00:00.143) 0:03:28.261 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:51 Saturday 31 August 2024 18:39:00 -0400 (0:00:00.071) 0:03:28.333 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Saturday 31 August 2024 18:39:00 -0400 (0:00:00.069) 0:03:28.402 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Saturday 31 August 2024 18:39:00 -0400 (0:00:00.069) 0:03:28.472 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Saturday 31 August 2024 18:39:00 -0400 (0:00:00.075) 0:03:28.547 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1725143925.25993, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1725143925.25993, "dev": 5, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 194804, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1725143925.25993, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Saturday 31 August 2024 18:39:01 -0400 (0:00:00.489) 0:03:29.037 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Saturday 31 August 2024 18:39:01 -0400 (0:00:00.074) 0:03:29.112 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[Make sure we got info about this volume] ********************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Saturday 31 August 2024 18:39:01 -0400 (0:00:00.054) 0:03:29.166 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Saturday 31 August 2024 18:39:01 -0400 (0:00:00.067) 0:03:29.233 ******* ok: [managed_node2] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Saturday 31 August 2024 18:39:01 -0400 (0:00:00.060) 0:03:29.294 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Saturday 31 August 2024 18:39:01 -0400 (0:00:00.055) 0:03:29.350 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Saturday 31 August 2024 18:39:01 -0400 (0:00:00.066) 0:03:29.416 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1725143925.4049325, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1725143925.4049325, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 194859, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1725143925.4049325, "nlink": 1, "path": "/dev/mapper/luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Saturday 31 August 2024 18:39:02 -0400 (0:00:00.612) 0:03:30.029 ******* ok: [managed_node2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Saturday 31 August 2024 18:39:03 -0400 (0:00:00.917) 0:03:30.947 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda1" ], "delta": "0:00:00.028072", "end": "2024-08-31 18:39:03.501475", "rc": 0, "start": "2024-08-31 18:39:03.473403" } STDOUT: LUKS header information for /dev/sda1 
Version: 1
Cipher name: aes
Cipher mode: xts-plain64
Hash spec: sha256
Payload offset: 8192
MK bits: 512
MK digest: d6 d0 c8 37 ee 86 f1 90 7b fb 63 1a 12 9d 28 8c 32 4d c6 49
MK salt: 5d 9a 4d 43 77 a6 40 bd 34 3c 7d 73 3b 32 25 f2 0a 49 8a b1 55 61 e0 ba d0 27 06 e8 46 81 06 47
MK iterations: 20660
UUID: 93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8
Key Slot 0: ENABLED
  Iterations: 327270
  Salt: 12 97 75 ec 4f 1e 6a d5 7a 35 ba 2d 3a 95 4d 2a ee db 43 2a e8 0b bf 42 df 96 bc e5 b5 3a 45 91
  Key material offset: 8
  AF stripes: 4000
Key Slot 1: DISABLED
Key Slot 2: DISABLED
Key Slot 3: DISABLED
Key Slot 4: DISABLED
Key Slot 5: DISABLED
Key Slot 6: DISABLED
Key Slot 7: DISABLED
TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Saturday 31 August 2024 18:39:03 -0400 (0:00:00.463) 0:03:31.411 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Saturday 31 August 2024 18:39:03 -0400 (0:00:00.079) 0:03:31.490 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Saturday 31 August 2024 18:39:03 -0400 (0:00:00.069) 0:03:31.560 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Saturday 31 August 2024 18:39:03 -0400 (0:00:00.147) 0:03:31.708 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Saturday 31 August 2024 18:39:03 -0400 (0:00:00.059) 0:03:31.768 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Saturday 31 August 2024 18:39:04 -0400 (0:00:00.066) 0:03:31.834 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Saturday 31 August 2024 18:39:04 -0400 (0:00:00.055) 0:03:31.889 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Saturday 31 August 2024 18:39:04 -0400 (0:00:00.056) 0:03:31.946 ******* ok: [managed_node2] => { "ansible_facts":
{ "_storage_test_crypttab_entries": [ "luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8 /dev/sda1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Saturday 31 August 2024 18:39:04 -0400 (0:00:00.069) 0:03:32.015 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Saturday 31 August 2024 18:39:04 -0400 (0:00:00.064) 0:03:32.079 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Saturday 31 August 2024 18:39:04 -0400 (0:00:00.063) 0:03:32.143 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Saturday 31 August 2024 18:39:04 -0400 (0:00:00.064) 0:03:32.207 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Saturday 31 August 2024 18:39:04 -0400 (0:00:00.063) 0:03:32.271 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Saturday 31 August 2024 18:39:04 -0400 (0:00:00.054) 0:03:32.325 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Saturday 31 August 2024 18:39:04 -0400 (0:00:00.110) 0:03:32.435 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Saturday 31 August 2024 18:39:04 -0400 (0:00:00.056) 0:03:32.492 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Saturday 31 August 2024 18:39:04 -0400 (0:00:00.083) 0:03:32.575 ******* skipping: [managed_node2] => { "changed": 
false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Saturday 31 August 2024 18:39:04 -0400 (0:00:00.095) 0:03:32.671 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Saturday 31 August 2024 18:39:05 -0400 (0:00:00.174) 0:03:32.846 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Saturday 31 August 2024 18:39:05 -0400 (0:00:00.097) 0:03:32.943 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Saturday 31 August 2024 18:39:05 -0400 (0:00:00.052) 0:03:32.996 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Saturday 31 August 2024 18:39:05 -0400 (0:00:00.054) 0:03:33.050 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Saturday 31 August 2024 18:39:05 -0400 (0:00:00.054) 0:03:33.105 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Saturday 31 August 2024 18:39:05 -0400 (0:00:00.059) 0:03:33.164 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Saturday 31 August 2024 18:39:05 -0400 (0:00:00.053) 0:03:33.218 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Saturday 31 August 2024 18:39:05 -0400 (0:00:00.061) 0:03:33.280 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: 
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Saturday 31 August 2024 18:39:05 -0400 (0:00:00.052) 0:03:33.332 ******* ok: [managed_node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Saturday 31 August 2024 18:39:05 -0400 (0:00:00.046) 0:03:33.378 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Saturday 31 August 2024 18:39:05 -0400 (0:00:00.035) 0:03:33.414 ******* skipping: [managed_node2] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Saturday 31 August 2024 18:39:05 -0400 (0:00:00.036) 0:03:33.451 ******* skipping: [managed_node2] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Saturday 31 August 2024 18:39:05 -0400 (0:00:00.034) 0:03:33.485 ******* skipping: [managed_node2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Saturday 31 August 2024 18:39:05 -0400 (0:00:00.033) 0:03:33.519 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Saturday 31 August 2024 18:39:05 -0400 (0:00:00.033) 0:03:33.552 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Saturday 31 August 2024 18:39:05 -0400 (0:00:00.038) 0:03:33.591 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Saturday 31 August 2024 18:39:05 -0400 (0:00:00.034) 0:03:33.626 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Saturday 31 August 2024 18:39:05 -0400 (0:00:00.034) 0:03:33.661 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] 
******************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Saturday 31 August 2024 18:39:05 -0400 (0:00:00.034) 0:03:33.696 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Saturday 31 August 2024 18:39:05 -0400 (0:00:00.034) 0:03:33.730 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Saturday 31 August 2024 18:39:05 -0400 (0:00:00.033) 0:03:33.764 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Saturday 31 August 2024 18:39:05 -0400 (0:00:00.035) 0:03:33.800 ******* skipping: [managed_node2] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Saturday 31 August 2024 18:39:06 -0400 (0:00:00.033) 0:03:33.833 ******* skipping: [managed_node2] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Saturday 31 August 2024 18:39:06 -0400 (0:00:00.038) 0:03:33.872 ******* skipping: [managed_node2] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Saturday 31 August 2024 18:39:06 -0400 (0:00:00.032) 0:03:33.904 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Saturday 31 August 2024 18:39:06 -0400 (0:00:00.032) 0:03:33.937 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Saturday 31 August 2024 18:39:06 -0400 (0:00:00.031) 0:03:33.968 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Saturday 31 August 2024 18:39:06 -0400 (0:00:00.033) 0:03:34.002 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] 
********************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Saturday 31 August 2024 18:39:06 -0400 (0:00:00.032) 0:03:34.035 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Saturday 31 August 2024 18:39:06 -0400 (0:00:00.032) 0:03:34.067 ******* ok: [managed_node2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Saturday 31 August 2024 18:39:06 -0400 (0:00:00.038) 0:03:34.105 ******* ok: [managed_node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Saturday 31 August 2024 18:39:06 -0400 (0:00:00.036) 0:03:34.142 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Saturday 31 August 2024 18:39:06 -0400 (0:00:00.035) 0:03:34.178 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Saturday 31 August 2024 18:39:06 -0400 (0:00:00.036) 0:03:34.214 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Saturday 31 August 2024 18:39:06 -0400 (0:00:00.034) 0:03:34.249 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Saturday 31 August 2024 18:39:06 -0400 (0:00:00.035) 0:03:34.284 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Saturday 31 August 2024 18:39:06 -0400 (0:00:00.034) 0:03:34.318 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: 
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Saturday 31 August 2024 18:39:06 -0400 (0:00:00.034) 0:03:34.353 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Saturday 31 August 2024 18:39:06 -0400 (0:00:00.052) 0:03:34.405 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Saturday 31 August 2024 18:39:06 -0400 (0:00:00.059) 0:03:34.464 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Saturday 31 August 2024 18:39:06 -0400 (0:00:00.054) 0:03:34.519 ******* TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Saturday 31 August 2024 18:39:06 -0400 (0:00:00.051) 0:03:34.571 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Saturday 31 August 2024 18:39:06 -0400 (0:00:00.055) 0:03:34.626 ******* changed: [managed_node2] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:220 Saturday 31 August 2024 18:39:07 -0400 (0:00:00.425) 0:03:35.052 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed_node2 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Saturday 31 August 2024 18:39:07 -0400 (0:00:00.124) 0:03:35.176 ******* ok: [managed_node2] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Saturday 31 August 2024 18:39:07 -0400 (0:00:00.043) 0:03:35.220 ******* TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 31 August 2024 18:39:07 
-0400 (0:00:00.051) 0:03:35.271 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 31 August 2024 18:39:07 -0400 (0:00:00.055) 0:03:35.326 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 31 August 2024 18:39:07 -0400 (0:00:00.101) 0:03:35.428 ******* skipping: [managed_node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed_node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed_node2] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed_node2] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 31 August 2024 18:39:07 -0400 (0:00:00.084) 0:03:35.512 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 31 August 2024 18:39:07 -0400 (0:00:00.034) 0:03:35.546 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 31 August 2024 18:39:07 -0400 (0:00:00.034) 0:03:35.580 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 31 August 2024 18:39:07 -0400 (0:00:00.033) 0:03:35.614 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: 
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 31 August 2024 18:39:07 -0400 (0:00:00.041) 0:03:35.655 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 31 August 2024 18:39:07 -0400 (0:00:00.081) 0:03:35.737 ******* ok: [managed_node2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34 TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 31 August 2024 18:39:09 -0400 (0:00:01.290) 0:03:37.027 ******* ok: [managed_node2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 31 August 2024 18:39:09 -0400 (0:00:00.052) 0:03:37.079 ******* ok: [managed_node2] => { "storage_volumes": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 31 August 2024 18:39:09 -0400 (0:00:00.060) 0:03:37.139 ******* ok: [managed_node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Saturday 31 August 2024 18:39:13 -0400 (0:00:04.153) 0:03:41.293 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 31 August 2024 18:39:13 -0400 (0:00:00.133) 0:03:41.426 ******* TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: 
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 31 August 2024 18:39:13 -0400 (0:00:00.072) 0:03:41.499 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 31 August 2024 18:39:13 -0400 (0:00:00.086) 0:03:41.586 ******* TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 31 August 2024 18:39:13 -0400 (0:00:00.051) 0:03:41.637 ******* ok: [managed_node2] => { "changed": false, "rc": 0, "results": [ "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } lsrpackages: kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Saturday 31 August 2024 18:39:14 -0400 (0:00:00.765) 0:03:42.403 ******* ok: [managed_node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": 
"getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", 
"status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": 
"rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": 
"systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-cryptsetup@luks\\x2d6dc95608\\x2d1152\\x2d4f68\\x2d9e9a\\x2dbf366f9ae1bd.service": { "name": "systemd-cryptsetup@luks\\x2d6dc95608\\x2d1152\\x2d4f68\\x2d9e9a\\x2dbf366f9ae1bd.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, 
"systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": 
"targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Saturday 31 August 2024 18:39:15 -0400 (0:00:01.153) 0:03:43.556 ******* ok: [managed_node2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d6dc95608\\x2d1152\\x2d4f68\\x2d9e9a\\x2dbf366f9ae1bd.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Saturday 31 August 2024 18:39:15 -0400 (0:00:00.100) 0:03:43.657 ******* changed: [managed_node2] => (item=systemd-cryptsetup@luks\x2d6dc95608\x2d1152\x2d4f68\x2d9e9a\x2dbf366f9ae1bd.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d6dc95608\\x2d1152\\x2d4f68\\x2d9e9a\\x2dbf366f9ae1bd.service", "name": "systemd-cryptsetup@luks\\x2d6dc95608\\x2d1152\\x2d4f68\\x2d9e9a\\x2dbf366f9ae1bd.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-journald.socket system-systemd\\x2dcryptsetup.slice cryptsetup-pre.target systemd-readahead-collect.service dev-sda.device systemd-readahead-replay.service", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-sda.device", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "Delegate": "no", "Description": "Cryptography Setup for luks-6dc95608-1152-4f68-9e9a-bf366f9ae1bd", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-6dc95608-1152-4f68-9e9a-bf366f9ae1bd /dev/sda - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-6dc95608-1152-4f68-9e9a-bf366f9ae1bd ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": 
"/run/systemd/generator/systemd-cryptsetup@luks\\x2d6dc95608\\x2d1152\\x2d4f68\\x2d9e9a\\x2dbf366f9ae1bd.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d6dc95608\\x2d1152\\x2d4f68\\x2d9e9a\\x2dbf366f9ae1bd.service", "IgnoreOnIsolate": "yes", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d6dc95608\\x2d1152\\x2d4f68\\x2d9e9a\\x2dbf366f9ae1bd.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "0", "TimeoutStopUSec": "0", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "bad", "WantedBy": "dev-sda.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Saturday 31 August 2024 18:39:16 -0400 (0:00:00.677) 0:03:44.334 ******* fatal: [managed_node2]: FAILED! 
=> { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:109 Saturday 31 August 2024 18:39:21 -0400 (0:00:04.550) 0:03:48.884 ******* fatal: [managed_node2]: FAILED! => { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'diskvolume_mkfs_option_map': {u'ext4': u'-F', u'ext3': u'-F', u'ext2': u'-F'}, u'safe_mode': True, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'encryption_tang_url': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_tang_thumbprint': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'encryption': False, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_key_size': 0, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': u'luks1', u'raid_stripe_size': None, u'mount_passno': 0, u'mount_user': None, u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'mount_group': None, u'fs_overwrite_existing': True, u'disks': [u'sda'], u'cached': False, u'thin_pool_size': None, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'shared': False, u'encryption_clevis_pin': None, u'type': u'partition', u'encryption_cipher': None, u'raid_chunk_size': None}], u'volumes': [], u'pool_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_password': None, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'shared': False, u'type': u'lvm', u'encryption_key_size': None, u'raid_chunk_size': None}, u'volume_defaults': {u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 0, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'cache_size': 0, u'raid_spare_count': None, u'cache_mode': None, u'deduplication': None, 
u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove existing formatting on device 'luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8' in safe mode due to encryption removal"} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Saturday 31 August 2024 18:39:21 -0400 (0:00:00.140) 0:03:49.025 ******* changed: [managed_node2] => (item=systemd-cryptsetup@luks\x2d6dc95608\x2d1152\x2d4f68\x2d9e9a\x2dbf366f9ae1bd.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d6dc95608\\x2d1152\\x2d4f68\\x2d9e9a\\x2dbf366f9ae1bd.service", "name": "systemd-cryptsetup@luks\\x2d6dc95608\\x2d1152\\x2d4f68\\x2d9e9a\\x2dbf366f9ae1bd.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d6dc95608\\x2d1152\\x2d4f68\\x2d9e9a\\x2dbf366f9ae1bd.service", "DevicePolicy": "auto", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d6dc95608\\x2d1152\\x2d4f68\\x2d9e9a\\x2dbf366f9ae1bd.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "1048576", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "masked", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d6dc95608\\x2d1152\\x2d4f68\\x2d9e9a\\x2dbf366f9ae1bd.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", 
"PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", "UnitFileState": "bad", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Saturday 31 August 2024 18:39:21 -0400 (0:00:00.697) 0:03:49.722 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Saturday 31 August 2024 18:39:21 -0400 (0:00:00.047) 0:03:49.770 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Saturday 31 August 2024 18:39:22 -0400 (0:00:00.074) 0:03:49.844 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Saturday 31 August 2024 18:39:22 -0400 (0:00:00.043) 0:03:49.888 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1725143947.1453152, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1725143947.1453152, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 67, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1725143947.1453152, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "210409782", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: 
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Saturday 31 August 2024 18:39:22 -0400 (0:00:00.383) 0:03:50.271 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer] ********************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:244 Saturday 31 August 2024 18:39:22 -0400 (0:00:00.057) 0:03:50.328 ******* TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 31 August 2024 18:39:22 -0400 (0:00:00.282) 0:03:50.611 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 31 August 2024 18:39:22 -0400 (0:00:00.083) 0:03:50.694 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 31 August 2024 18:39:22 -0400 (0:00:00.063) 0:03:50.757 ******* skipping: [managed_node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed_node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed_node2] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed_node2] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 31 August 2024 18:39:23 -0400 (0:00:00.120) 0:03:50.878 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 31 August 2024 18:39:23 -0400 (0:00:00.052) 0:03:50.930 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: 
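
The variable lookup above walks the role's generic-to-specific chain: RedHat.yml and CentOS.yml carry nothing extra for this platform and are skipped, CentOS_7.yml matches and supplies the mkfs option map plus the blivet package list (note the s390x conditional for libblockdev), and CentOS_7.9.yml does not exist on disk. Files loaded later override earlier ones, so the most specific match wins. A rough sketch of the pattern, not the role's literal task:

    - name: Set platform/version specific variables
      include_vars: "{{ role_path }}/vars/{{ item }}"
      loop:
        - "{{ ansible_facts['os_family'] }}.yml"         # RedHat.yml
        - "{{ ansible_facts['distribution'] }}.yml"      # CentOS.yml
        - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_major_version'] }}.yml"
        - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_version'] }}.yml"
      when: (role_path ~ '/vars/' ~ item) is file
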
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 31 August 2024 18:39:23 -0400 (0:00:00.055) 0:03:50.986 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 31 August 2024 18:39:23 -0400 (0:00:00.075) 0:03:51.061 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 31 August 2024 18:39:23 -0400 (0:00:00.052) 0:03:51.114 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 31 August 2024 18:39:23 -0400 (0:00:00.126) 0:03:51.241 ******* ok: [managed_node2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34 TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 31 August 2024 18:39:24 -0400 (0:00:01.360) 0:03:52.601 ******* ok: [managed_node2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 31 August 2024 18:39:24 -0400 (0:00:00.040) 0:03:52.642 ******* ok: [managed_node2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 31 August 2024 18:39:24 -0400 (0:00:00.035) 0:03:52.677 ******* ok: [managed_node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK 
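
Two details above are easy to miss: storage_volumes prints as undefined because this play only manages a pool, and "Get required packages" runs the blivet module in a packages-only dry mode so the role can install any missing tooling before touching the disks. Condensed to the two-phase pattern it implements (a sketch, not the role's literal tasks):

    - name: Get required packages
      blivet:
        pools: "{{ storage_pools }}"
        volumes: "{{ storage_volumes | default([]) }}"
        packages_only: true          # report needed packages, change nothing
      register: package_info

    - name: Make sure required packages are installed
      package:
        name: "{{ package_info.packages }}"
        state: present
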
[fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Saturday 31 August 2024 18:39:29 -0400 (0:00:04.451) 0:03:57.128 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 31 August 2024 18:39:29 -0400 (0:00:00.109) 0:03:57.238 ******* TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 31 August 2024 18:39:29 -0400 (0:00:00.057) 0:03:57.296 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 31 August 2024 18:39:29 -0400 (0:00:00.051) 0:03:57.347 ******* TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 31 August 2024 18:39:29 -0400 (0:00:00.051) 0:03:57.399 ******* ok: [managed_node2] => { "changed": false, "rc": 0, "results": [ "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } lsrpackages: kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Saturday 31 August 2024 18:39:30 -0400 (0:00:00.776) 0:03:58.176 ******* ok: [managed_node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", 
"source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": 
"unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", 
"source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, 
"serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-cryptsetup@luks\\x2d93e795b2\\x2dee2c\\x2d4fdd\\x2da9e2\\x2d3e2d39be4ae8.service": { "name": "systemd-cryptsetup@luks\\x2d93e795b2\\x2dee2c\\x2d4fdd\\x2da9e2\\x2d3e2d39be4ae8.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": 
"systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Saturday 31 August 2024 18:39:31 -0400 (0:00:01.199) 0:03:59.375 ******* ok: [managed_node2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d93e795b2\\x2dee2c\\x2d4fdd\\x2da9e2\\x2d3e2d39be4ae8.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Saturday 31 August 2024 18:39:31 -0400 (0:00:00.052) 0:03:59.427 ******* changed: [managed_node2] => (item=systemd-cryptsetup@luks\x2d93e795b2\x2dee2c\x2d4fdd\x2da9e2\x2d3e2d39be4ae8.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d93e795b2\\x2dee2c\\x2d4fdd\\x2da9e2\\x2d3e2d39be4ae8.service", "name": "systemd-cryptsetup@luks\\x2d93e795b2\\x2dee2c\\x2d4fdd\\x2da9e2\\x2d3e2d39be4ae8.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "dev-sda1.device system-systemd\\x2dcryptsetup.slice systemd-journald.socket systemd-readahead-replay.service cryptsetup-pre.target systemd-readahead-collect.service", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "Delegate": "no", "Description": "Cryptography Setup for luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) 
man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8 /dev/sda1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d93e795b2\\x2dee2c\\x2d4fdd\\x2da9e2\\x2d3e2d39be4ae8.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d93e795b2\\x2dee2c\\x2d4fdd\\x2da9e2\\x2d3e2d39be4ae8.service", "IgnoreOnIsolate": "yes", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d93e795b2\\x2dee2c\\x2d4fdd\\x2da9e2\\x2d3e2d39be4ae8.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RequiredBy": "cryptsetup.target dev-mapper-luks\\x2d93e795b2\\x2dee2c\\x2d4fdd\\x2da9e2\\x2d3e2d39be4ae8.device", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "0", "TimeoutStopUSec": "0", "TimerSlackNSec": 
"50000", "Transient": "no", "Type": "oneshot", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "bad", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Saturday 31 August 2024 18:39:32 -0400 (0:00:00.577) 0:04:00.005 ******* changed: [managed_node2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=0d98aff8-ed7f-4ffa-8d82-ae0b994268e2", "state": "mounted" } ], "packages": [ "xfsprogs", "e2fsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=0d98aff8-ed7f-4ffa-8d82-ae0b994268e2", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Saturday 31 August 2024 18:39:36 -0400 (0:00:04.742) 0:04:04.747 
******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Saturday 31 August 2024 18:39:36 -0400 (0:00:00.035) 0:04:04.783 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1725143929.3060012, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "720ab1ed0578bdceef2d1c134ecd1962d9ce28db", "ctime": 1725143929.302001, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263550, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1725143929.302001, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1311, "uid": 0, "version": "18446744071584693682", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Saturday 31 August 2024 18:39:37 -0400 (0:00:00.349) 0:04:05.132 ******* ok: [managed_node2] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Saturday 31 August 2024 18:39:37 -0400 (0:00:00.340) 0:04:05.473 ******* changed: [managed_node2] => (item=systemd-cryptsetup@luks\x2d93e795b2\x2dee2c\x2d4fdd\x2da9e2\x2d3e2d39be4ae8.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d93e795b2\\x2dee2c\\x2d4fdd\\x2da9e2\\x2d3e2d39be4ae8.service", "name": "systemd-cryptsetup@luks\\x2d93e795b2\\x2dee2c\\x2d4fdd\\x2da9e2\\x2d3e2d39be4ae8.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d93e795b2\\x2dee2c\\x2d4fdd\\x2da9e2\\x2d3e2d39be4ae8.service", "DevicePolicy": "auto", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d93e795b2\\x2dee2c\\x2d4fdd\\x2da9e2\\x2d3e2d39be4ae8.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", 
"IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "1048576", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "masked", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d93e795b2\\x2dee2c\\x2d4fdd\\x2da9e2\\x2d3e2d39be4ae8.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RequiredBy": "dev-mapper-luks\\x2d93e795b2\\x2dee2c\\x2d4fdd\\x2da9e2\\x2d3e2d39be4ae8.device cryptsetup.target", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", "UnitFileState": "bad", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Saturday 31 August 2024 18:39:38 -0400 (0:00:00.519) 0:04:05.992 ******* ok: [managed_node2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": 
"/opt/test1", "src": "/dev/mapper/luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=0d98aff8-ed7f-4ffa-8d82-ae0b994268e2", "state": "mounted" } ], "packages": [ "xfsprogs", "e2fsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=0d98aff8-ed7f-4ffa-8d82-ae0b994268e2", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Saturday 31 August 2024 18:39:38 -0400 (0:00:00.049) 0:04:06.041 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=0d98aff8-ed7f-4ffa-8d82-ae0b994268e2", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", 
"mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Saturday 31 August 2024 18:39:38 -0400 (0:00:00.044) 0:04:06.086 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Saturday 31 August 2024 18:39:38 -0400 (0:00:00.174) 0:04:06.260 ******* changed: [managed_node2] => (item={u'src': u'/dev/mapper/luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Saturday 31 August 2024 18:39:38 -0400 (0:00:00.366) 0:04:06.626 ******* ok: [managed_node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Saturday 31 August 2024 18:39:39 -0400 (0:00:00.544) 0:04:07.171 ******* changed: [managed_node2] => (item={u'src': u'UUID=0d98aff8-ed7f-4ffa-8d82-ae0b994268e2', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=0d98aff8-ed7f-4ffa-8d82-ae0b994268e2", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=0d98aff8-ed7f-4ffa-8d82-ae0b994268e2" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Saturday 31 August 2024 18:39:39 -0400 (0:00:00.425) 0:04:07.596 ******* skipping: [managed_node2] => (item={u'src': u'UUID=0d98aff8-ed7f-4ffa-8d82-ae0b994268e2', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": 
"mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=0d98aff8-ed7f-4ffa-8d82-ae0b994268e2", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Saturday 31 August 2024 18:39:39 -0400 (0:00:00.092) 0:04:07.689 ******* ok: [managed_node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Saturday 31 August 2024 18:39:40 -0400 (0:00:00.529) 0:04:08.218 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1725143934.3850906, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "5dc11e46f7cd620b109c326b8ca553417aaba389", "ctime": 1725143931.4220383, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 917510, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1725143931.4210384, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 54, "uid": 0, "version": "18446744071584696439", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Saturday 31 August 2024 18:39:40 -0400 (0:00:00.371) 0:04:08.590 ******* changed: [managed_node2] => (item={u'state': u'absent', u'password': u'-', u'name': u'luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8', u'backing_device': u'/dev/sda1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Saturday 31 August 2024 18:39:41 -0400 (0:00:00.414) 0:04:09.005 ******* ok: [managed_node2] TASK [Verify role results] ***************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:261 Saturday 31 August 2024 18:39:41 -0400 (0:00:00.753) 0:04:09.759 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed_node2 TASK [Print out pool information] ********************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Saturday 31 August 2024 18:39:42 -0400 (0:00:00.097) 0:04:09.857 ******* ok: [managed_node2] => { "_storage_pools_list": [ { "disks": [ "sda" ], 
"encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=0d98aff8-ed7f-4ffa-8d82-ae0b994268e2", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Saturday 31 August 2024 18:39:42 -0400 (0:00:00.044) 0:04:09.901 ******* skipping: [managed_node2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Saturday 31 August 2024 18:39:42 -0400 (0:00:00.032) 0:04:09.933 ******* ok: [managed_node2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "0d98aff8-ed7f-4ffa-8d82-ae0b994268e2" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Saturday 31 August 2024 18:39:42 -0400 (0:00:00.379) 0:04:10.313 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003211", "end": "2024-08-31 18:39:42.777814", "rc": 0, "start": "2024-08-31 18:39:42.774603" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Thu Jun 20 10:23:46 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 
UUID=0d98aff8-ed7f-4ffa-8d82-ae0b994268e2 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Saturday 31 August 2024 18:39:42 -0400 (0:00:00.384) 0:04:10.697 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003247", "end": "2024-08-31 18:39:43.167399", "failed_when_result": false, "rc": 0, "start": "2024-08-31 18:39:43.164152" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Saturday 31 August 2024 18:39:43 -0400 (0:00:00.361) 0:04:11.059 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed_node2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Saturday 31 August 2024 18:39:43 -0400 (0:00:00.095) 0:04:11.154 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Saturday 31 August 2024 18:39:43 -0400 (0:00:00.035) 0:04:11.189 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Saturday 31 August 2024 18:39:43 -0400 (0:00:00.033) 0:04:11.223 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Saturday 31 August 2024 18:39:43 -0400 (0:00:00.071) 0:04:11.295 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed_node2 included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed_node2 TASK [Set test variables] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Saturday 31 August 2024 18:39:43 -0400 (0:00:00.118) 0:04:11.414 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Saturday 31 August 2024 18:39:43 -0400 (0:00:00.054) 0:04:11.468 ******* TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Saturday 31 August 2024 18:39:43 -0400 (0:00:00.049) 0:04:11.518 
******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Saturday 31 August 2024 18:39:43 -0400 (0:00:00.052) 0:04:11.571 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Saturday 31 August 2024 18:39:43 -0400 (0:00:00.058) 0:04:11.630 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Saturday 31 August 2024 18:39:43 -0400 (0:00:00.050) 0:04:11.680 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Saturday 31 August 2024 18:39:43 -0400 (0:00:00.052) 0:04:11.733 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Saturday 31 August 2024 18:39:43 -0400 (0:00:00.050) 0:04:11.783 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:51 Saturday 31 August 2024 18:39:44 -0400 (0:00:00.050) 0:04:11.834 ******* TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:64 Saturday 31 August 2024 18:39:44 -0400 (0:00:00.058) 0:04:11.893 ******* ok: [managed_node2] => { "changed": false, "rc": 0 } STDOUT: False STDERR: Shared connection to 10.31.12.24 closed. 
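The probe above printed "False": the EL7 blivet on this host predates PV grow-to-fill support, and the pool in this run has "grow_to_fill": false in any case, so the "Verify that PVs fill the whole devices when they should" task that follows loops over nothing and produces no per-item output. A minimal sketch of this probe-then-gate pattern, assuming an illustrative version check and hypothetical names (verify-pool-part.yml, storage_test_pool) rather than the test suite's actual embedded probe script:

    - name: Check that blivet supports PV grow to fill
      command: >-
        {{ ansible_python.executable }} -c
        'import blivet; from distutils.version import LooseVersion;
        print(LooseVersion(blivet.__version__) >= LooseVersion("3.7.0"))'
      register: storage_test_grow_supported
      changed_when: false   # read-only capability probe; the 3.7.0 threshold is illustrative only

    - name: Verify that PVs fill the whole devices when they should
      include_tasks: verify-pool-part.yml   # hypothetical per-PV assertion file
      loop: "{{ _storage_test_pool_pvs }}"
      when:
        - storage_test_grow_supported.stdout | trim == 'True'
        - storage_test_pool.grow_to_fill | d(false)

Probing once, registering the result, and gating the per-PV loop on it keeps the verification cheap on hosts whose blivet cannot grow PVs at all, which is exactly what this run shows.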
TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:73 Saturday 31 August 2024 18:39:44 -0400 (0:00:00.347) 0:04:12.240 ******* TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:83 Saturday 31 August 2024 18:39:44 -0400 (0:00:00.081) 0:04:12.322 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed_node2 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Saturday 31 August 2024 18:39:44 -0400 (0:00:00.140) 0:04:12.462 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Saturday 31 August 2024 18:39:44 -0400 (0:00:00.056) 0:04:12.519 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Saturday 31 August 2024 18:39:44 -0400 (0:00:00.064) 0:04:12.584 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Saturday 31 August 2024 18:39:44 -0400 (0:00:00.056) 0:04:12.640 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Saturday 31 August 2024 18:39:44 -0400 (0:00:00.055) 0:04:12.696 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Saturday 31 August 2024 18:39:44 -0400 (0:00:00.064) 0:04:12.760 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Saturday 31 August 2024 18:39:44 -0400 (0:00:00.054) 0:04:12.815 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Saturday 31 August 2024 18:39:45 -0400 (0:00:00.069) 0:04:12.884 ******* skipping: [managed_node2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Saturday 31 August 2024 18:39:45 -0400 (0:00:00.091) 0:04:12.976 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Saturday 31 August 2024 18:39:45 -0400 (0:00:00.051) 0:04:13.028 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Saturday 31 August 2024 18:39:45 -0400 (0:00:00.048) 0:04:13.077 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:86 Saturday 31 August 2024 18:39:45 -0400 (0:00:00.067) 0:04:13.144 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed_node2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Saturday 31 August 2024 18:39:45 -0400 (0:00:00.122) 0:04:13.267 ******* skipping: [managed_node2] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'/dev/sda1', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'_kernel_device': u'/dev/sda1', u'encryption': False, u'raid_level': None, u'name': u'test1', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_key_size': 0, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': u'luks1', u'raid_stripe_size': None, u'mount_passno': 0, u'_mount_id': u'UUID=0d98aff8-ed7f-4ffa-8d82-ae0b994268e2', u'mount_user': None, u'raid_spare_count': None, u'raid_disks': [], u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'cache_devices': [], u'deduplication': None, u'mount_group': None, u'thin_pool_size': None, u'disks': [u'sda'], u'cached': False, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'fs_overwrite_existing': True, u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=0d98aff8-ed7f-4ffa-8d82-ae0b994268e2", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], 
"cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:89 Saturday 31 August 2024 18:39:45 -0400 (0:00:00.061) 0:04:13.329 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed_node2 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Saturday 31 August 2024 18:39:45 -0400 (0:00:00.109) 0:04:13.439 ******* skipping: [managed_node2] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'/dev/sda1', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'_kernel_device': u'/dev/sda1', u'encryption': False, u'raid_level': None, u'name': u'test1', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_key_size': 0, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': u'luks1', u'raid_stripe_size': None, u'mount_passno': 0, u'_mount_id': u'UUID=0d98aff8-ed7f-4ffa-8d82-ae0b994268e2', u'mount_user': None, u'raid_spare_count': None, u'raid_disks': [], u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'cache_devices': [], u'deduplication': None, u'mount_group': None, u'thin_pool_size': None, u'disks': [u'sda'], u'cached': False, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'fs_overwrite_existing': True, u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=0d98aff8-ed7f-4ffa-8d82-ae0b994268e2", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", 
"fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:92 Saturday 31 August 2024 18:39:45 -0400 (0:00:00.074) 0:04:13.513 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed_node2 TASK [Set test variables] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Saturday 31 August 2024 18:39:45 -0400 (0:00:00.160) 0:04:13.674 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Saturday 31 August 2024 18:39:45 -0400 (0:00:00.065) 0:04:13.740 ******* TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Saturday 31 August 2024 18:39:46 -0400 (0:00:00.081) 0:04:13.821 ******* TASK [Clear test variables] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Saturday 31 August 2024 18:39:46 -0400 (0:00:00.111) 0:04:13.933 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:95 Saturday 31 August 2024 18:39:46 -0400 (0:00:00.136) 0:04:14.069 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed_node2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Saturday 31 August 2024 18:39:46 -0400 (0:00:00.105) 0:04:14.174 ******* skipping: [managed_node2] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'/dev/sda1', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'_kernel_device': u'/dev/sda1', u'encryption': False, u'raid_level': None, 
u'name': u'test1', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_key_size': 0, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': u'luks1', u'raid_stripe_size': None, u'mount_passno': 0, u'_mount_id': u'UUID=0d98aff8-ed7f-4ffa-8d82-ae0b994268e2', u'mount_user': None, u'raid_spare_count': None, u'raid_disks': [], u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'cache_devices': [], u'deduplication': None, u'mount_group': None, u'thin_pool_size': None, u'disks': [u'sda'], u'cached': False, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'fs_overwrite_existing': True, u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=0d98aff8-ed7f-4ffa-8d82-ae0b994268e2", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:98 Saturday 31 August 2024 18:39:46 -0400 (0:00:00.061) 0:04:14.236 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed_node2 TASK [Run 'stratis report'] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Saturday 31 August 2024 18:39:46 -0400 (0:00:00.094) 0:04:14.330 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:11 Saturday 31 August 2024 18:39:46 -0400 (0:00:00.035) 0:04:14.366 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools were created] ************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Saturday 31 August 2024 18:39:46 -0400 (0:00:00.032) 0:04:14.398 ******* skipping: [managed_node2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:25 Saturday 31 August 2024 18:39:46 -0400 (0:00:00.033) 0:04:14.431 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:34 Saturday 31 August 2024 18:39:46 -0400 (0:00:00.032) 0:04:14.463 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:44 Saturday 31 August 2024 18:39:46 -0400 (0:00:00.079) 0:04:14.543 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:101 Saturday 31 August 2024 18:39:46 -0400 (0:00:00.041) 0:04:14.585 ******* ok: [managed_node2] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Saturday 31 August 2024 18:39:46 -0400 (0:00:00.051) 0:04:14.636 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed_node2 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Saturday 31 August 2024 18:39:46 -0400 (0:00:00.106) 0:04:14.742 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Saturday 31 August 2024 18:39:46 -0400 (0:00:00.063) 0:04:14.805 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed_node2 included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed_node2 included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed_node2 included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed_node2 included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed_node2 included: 
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed_node2 included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed_node2 included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed_node2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Saturday 31 August 2024 18:39:47 -0400 (0:00:00.301) 0:04:15.107 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_device_path": "/dev/sda1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Saturday 31 August 2024 18:39:47 -0400 (0:00:00.077) 0:04:15.185 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Saturday 31 August 2024 18:39:47 -0400 (0:00:00.069) 0:04:15.254 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Saturday 31 August 2024 18:39:47 -0400 (0:00:00.057) 0:04:15.312 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Saturday 31 August 2024 18:39:47 -0400 (0:00:00.063) 0:04:15.376 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Saturday 31 August 2024 18:39:47 -0400 (0:00:00.073) 0:04:15.450 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Saturday 31 August 2024 18:39:47 -0400 (0:00:00.054) 0:04:15.504 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Saturday 31 August 2024 18:39:47 -0400 (0:00:00.076) 0:04:15.581 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: 
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Saturday 31 August 2024 18:39:47 -0400 (0:00:00.097) 0:04:15.678 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Saturday 31 August 2024 18:39:47 -0400 (0:00:00.057) 0:04:15.735 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Saturday 31 August 2024 18:39:47 -0400 (0:00:00.055) 0:04:15.791 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Saturday 31 August 2024 18:39:48 -0400 (0:00:00.054) 0:04:15.846 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=0d98aff8-ed7f-4ffa-8d82-ae0b994268e2 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Saturday 31 August 2024 18:39:48 -0400 (0:00:00.098) 0:04:15.944 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Saturday 31 August 2024 18:39:48 -0400 (0:00:00.064) 0:04:16.009 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Saturday 31 August 2024 18:39:48 -0400 (0:00:00.063) 0:04:16.072 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Saturday 31 August 2024 18:39:48 -0400 (0:00:00.050) 0:04:16.122 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:51 Saturday 
31 August 2024 18:39:48 -0400 (0:00:00.075) 0:04:16.197 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Saturday 31 August 2024 18:39:48 -0400 (0:00:00.068) 0:04:16.266 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Saturday 31 August 2024 18:39:48 -0400 (0:00:00.075) 0:04:16.341 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Saturday 31 August 2024 18:39:48 -0400 (0:00:00.075) 0:04:16.417 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1725143976.8168373, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1725143976.8168373, "dev": 5, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 206354, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1725143976.8168373, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Saturday 31 August 2024 18:39:49 -0400 (0:00:00.430) 0:04:16.847 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Saturday 31 August 2024 18:39:49 -0400 (0:00:00.067) 0:04:16.915 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Saturday 31 August 2024 18:39:49 -0400 (0:00:00.051) 0:04:16.966 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Saturday 31 August 2024 18:39:49 -0400 (0:00:00.061) 0:04:17.028 ******* 
ok: [managed_node2] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Saturday 31 August 2024 18:39:49 -0400 (0:00:00.056) 0:04:17.084 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Saturday 31 August 2024 18:39:49 -0400 (0:00:00.053) 0:04:17.138 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Saturday 31 August 2024 18:39:49 -0400 (0:00:00.059) 0:04:17.198 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Saturday 31 August 2024 18:39:49 -0400 (0:00:00.052) 0:04:17.251 ******* ok: [managed_node2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Saturday 31 August 2024 18:39:50 -0400 (0:00:00.772) 0:04:18.023 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Saturday 31 August 2024 18:39:50 -0400 (0:00:00.032) 0:04:18.056 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Saturday 31 August 2024 18:39:50 -0400 (0:00:00.032) 0:04:18.089 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Saturday 31 August 2024 18:39:50 -0400 (0:00:00.042) 0:04:18.132 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Saturday 31 August 2024 18:39:50 -0400 (0:00:00.031) 0:04:18.163 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] 
****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Saturday 31 August 2024 18:39:50 -0400 (0:00:00.033) 0:04:18.197 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Saturday 31 August 2024 18:39:50 -0400 (0:00:00.031) 0:04:18.228 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Saturday 31 August 2024 18:39:50 -0400 (0:00:00.031) 0:04:18.260 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Saturday 31 August 2024 18:39:50 -0400 (0:00:00.031) 0:04:18.291 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Saturday 31 August 2024 18:39:50 -0400 (0:00:00.054) 0:04:18.345 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Saturday 31 August 2024 18:39:50 -0400 (0:00:00.059) 0:04:18.404 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Saturday 31 August 2024 18:39:50 -0400 (0:00:00.053) 0:04:18.458 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Saturday 31 August 2024 18:39:50 -0400 (0:00:00.051) 0:04:18.510 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Saturday 31 August 2024 18:39:50 -0400 (0:00:00.052) 0:04:18.562 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": 
null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Saturday 31 August 2024 18:39:50 -0400 (0:00:00.053) 0:04:18.616 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Saturday 31 August 2024 18:39:50 -0400 (0:00:00.052) 0:04:18.668 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Saturday 31 August 2024 18:39:50 -0400 (0:00:00.052) 0:04:18.720 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Saturday 31 August 2024 18:39:50 -0400 (0:00:00.056) 0:04:18.776 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Saturday 31 August 2024 18:39:51 -0400 (0:00:00.053) 0:04:18.830 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Saturday 31 August 2024 18:39:51 -0400 (0:00:00.067) 0:04:18.897 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Saturday 31 August 2024 18:39:51 -0400 (0:00:00.062) 0:04:18.960 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Saturday 31 August 2024 18:39:51 -0400 (0:00:00.060) 0:04:19.020 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Saturday 31 August 2024 18:39:51 -0400 (0:00:00.053) 0:04:19.073 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: 
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Saturday 31 August 2024 18:39:51 -0400 (0:00:00.162) 0:04:19.235 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Saturday 31 August 2024 18:39:51 -0400 (0:00:00.077) 0:04:19.312 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Saturday 31 August 2024 18:39:51 -0400 (0:00:00.044) 0:04:19.357 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Saturday 31 August 2024 18:39:51 -0400 (0:00:00.058) 0:04:19.416 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Saturday 31 August 2024 18:39:51 -0400 (0:00:00.061) 0:04:19.477 ******* ok: [managed_node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Saturday 31 August 2024 18:39:51 -0400 (0:00:00.065) 0:04:19.542 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Saturday 31 August 2024 18:39:51 -0400 (0:00:00.065) 0:04:19.607 ******* skipping: [managed_node2] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Saturday 31 August 2024 18:39:51 -0400 (0:00:00.061) 0:04:19.669 ******* skipping: [managed_node2] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Saturday 31 August 2024 18:39:51 -0400 (0:00:00.049) 0:04:19.719 ******* skipping: [managed_node2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Saturday 31 August 2024 18:39:51 -0400 (0:00:00.042) 0:04:19.761 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] 
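The size verification above compares the requested size against the device's actual size, normalizing both to bytes first. In Jinja terms that normalization is essentially the human_to_bytes filter; a sketch with a hard-coded value where the test uses variables:

    - name: Parse the requested size of the volume (sketch)
      set_fact:
        storage_test_requested_size: "{{ '4g' | human_to_bytes }}"  # 4294967296 bytes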
********************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Saturday 31 August 2024 18:39:51 -0400 (0:00:00.049) 0:04:19.810 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Saturday 31 August 2024 18:39:52 -0400 (0:00:00.046) 0:04:19.857 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Saturday 31 August 2024 18:39:52 -0400 (0:00:00.042) 0:04:19.900 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Saturday 31 August 2024 18:39:52 -0400 (0:00:00.035) 0:04:19.935 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Saturday 31 August 2024 18:39:52 -0400 (0:00:00.034) 0:04:19.969 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Saturday 31 August 2024 18:39:52 -0400 (0:00:00.034) 0:04:20.004 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Saturday 31 August 2024 18:39:52 -0400 (0:00:00.034) 0:04:20.038 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Saturday 31 August 2024 18:39:52 -0400 (0:00:00.033) 0:04:20.072 ******* skipping: [managed_node2] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Saturday 31 August 2024 18:39:52 -0400 (0:00:00.036) 0:04:20.108 ******* skipping: [managed_node2] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Saturday 31 August 2024 18:39:52 -0400 (0:00:00.033) 0:04:20.142 ******* skipping: [managed_node2] => {} TASK [Establish base value for expected thin pool 
size] ************************ task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Saturday 31 August 2024 18:39:52 -0400 (0:00:00.033) 0:04:20.176 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Saturday 31 August 2024 18:39:52 -0400 (0:00:00.032) 0:04:20.209 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Saturday 31 August 2024 18:39:52 -0400 (0:00:00.034) 0:04:20.243 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Saturday 31 August 2024 18:39:52 -0400 (0:00:00.039) 0:04:20.282 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Saturday 31 August 2024 18:39:52 -0400 (0:00:00.045) 0:04:20.327 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Saturday 31 August 2024 18:39:52 -0400 (0:00:00.082) 0:04:20.410 ******* ok: [managed_node2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Saturday 31 August 2024 18:39:52 -0400 (0:00:00.055) 0:04:20.465 ******* ok: [managed_node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Saturday 31 August 2024 18:39:52 -0400 (0:00:00.044) 0:04:20.510 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Saturday 31 August 2024 18:39:52 -0400 (0:00:00.043) 0:04:20.553 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: 
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Saturday 31 August 2024 18:39:52 -0400 (0:00:00.044) 0:04:20.598 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Saturday 31 August 2024 18:39:52 -0400 (0:00:00.041) 0:04:20.639 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Saturday 31 August 2024 18:39:52 -0400 (0:00:00.033) 0:04:20.673 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Saturday 31 August 2024 18:39:52 -0400 (0:00:00.033) 0:04:20.707 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Saturday 31 August 2024 18:39:52 -0400 (0:00:00.033) 0:04:20.741 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Saturday 31 August 2024 18:39:52 -0400 (0:00:00.033) 0:04:20.774 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Saturday 31 August 2024 18:39:52 -0400 (0:00:00.032) 0:04:20.807 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Saturday 31 August 2024 18:39:53 -0400 (0:00:00.035) 0:04:20.842 ******* TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Saturday 31 August 2024 18:39:53 -0400 (0:00:00.030) 0:04:20.873 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Saturday 31 August 2024 18:39:53 -0400 (0:00:00.032) 0:04:20.905 ******* changed: [managed_node2] => { "changed": true, 
"dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:267 Saturday 31 August 2024 18:39:53 -0400 (0:00:00.371) 0:04:21.277 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed_node2 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Saturday 31 August 2024 18:39:53 -0400 (0:00:00.180) 0:04:21.457 ******* ok: [managed_node2] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Saturday 31 August 2024 18:39:53 -0400 (0:00:00.056) 0:04:21.514 ******* TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 31 August 2024 18:39:53 -0400 (0:00:00.064) 0:04:21.579 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 31 August 2024 18:39:53 -0400 (0:00:00.086) 0:04:21.665 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 31 August 2024 18:39:53 -0400 (0:00:00.052) 0:04:21.717 ******* skipping: [managed_node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed_node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed_node2] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed_node2] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: 
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 31 August 2024 18:39:54 -0400 (0:00:00.099) 0:04:21.817 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 31 August 2024 18:39:54 -0400 (0:00:00.033) 0:04:21.851 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 31 August 2024 18:39:54 -0400 (0:00:00.032) 0:04:21.884 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 31 August 2024 18:39:54 -0400 (0:00:00.037) 0:04:21.922 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 31 August 2024 18:39:54 -0400 (0:00:00.047) 0:04:21.970 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 31 August 2024 18:39:54 -0400 (0:00:00.123) 0:04:22.093 ******* ok: [managed_node2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34 TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 31 August 2024 18:39:55 -0400 (0:00:01.343) 0:04:23.436 ******* ok: [managed_node2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] 
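For context, the storage_pools structure shown above corresponds to role input along these lines (reassembled from the logged values; yabbadabbadoo is the test's dummy passphrase, not a recommendation):

    - hosts: managed_node2
      roles:
        - fedora.linux_system_roles.storage
      vars:
        storage_pools:
          - name: foo
            type: partition
            disks:
              - sda
            volumes:
              - name: test1
                type: partition
                size: 4g
                mount_point: /opt/test1
                encryption: true
                encryption_password: yabbadabbadoo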
**************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 31 August 2024 18:39:55 -0400 (0:00:00.067) 0:04:23.503 ******* ok: [managed_node2] => { "storage_volumes": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 31 August 2024 18:39:55 -0400 (0:00:00.081) 0:04:23.585 ******* ok: [managed_node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Saturday 31 August 2024 18:39:59 -0400 (0:00:03.975) 0:04:27.561 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 31 August 2024 18:39:59 -0400 (0:00:00.081) 0:04:27.642 ******* TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 31 August 2024 18:39:59 -0400 (0:00:00.039) 0:04:27.682 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 31 August 2024 18:39:59 -0400 (0:00:00.057) 0:04:27.739 ******* TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 31 August 2024 18:39:59 -0400 (0:00:00.034) 0:04:27.773 ******* ok: [managed_node2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } lsrpackages: cryptsetup kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Saturday 31 August 2024 18:40:00 -0400 (0:00:00.743) 0:04:28.516 ******* ok: [managed_node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, 
"auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": 
"dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": 
"lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, 
"plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": 
"rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-cryptsetup@luks\\x2d93e795b2\\x2dee2c\\x2d4fdd\\x2da9e2\\x2d3e2d39be4ae8.service": { "name": "systemd-cryptsetup@luks\\x2d93e795b2\\x2dee2c\\x2d4fdd\\x2da9e2\\x2d3e2d39be4ae8.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": 
"systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Saturday 31 August 2024 18:40:01 -0400 (0:00:01.098) 0:04:29.615 ******* ok: [managed_node2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d93e795b2\\x2dee2c\\x2d4fdd\\x2da9e2\\x2d3e2d39be4ae8.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Saturday 31 August 2024 18:40:01 -0400 (0:00:00.078) 0:04:29.693 ******* changed: [managed_node2] => (item=systemd-cryptsetup@luks\x2d93e795b2\x2dee2c\x2d4fdd\x2da9e2\x2d3e2d39be4ae8.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d93e795b2\\x2dee2c\\x2d4fdd\\x2da9e2\\x2d3e2d39be4ae8.service", "name": "systemd-cryptsetup@luks\\x2d93e795b2\\x2dee2c\\x2d4fdd\\x2da9e2\\x2d3e2d39be4ae8.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "cryptsetup-pre.target systemd-readahead-replay.service dev-sda1.device systemd-journald.socket system-systemd\\x2dcryptsetup.slice systemd-readahead-collect.service", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", 
"AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "Delegate": "no", "Description": "Cryptography Setup for luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8 /dev/sda1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-93e795b2-ee2c-4fdd-a9e2-3e2d39be4ae8 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d93e795b2\\x2dee2c\\x2d4fdd\\x2da9e2\\x2d3e2d39be4ae8.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d93e795b2\\x2dee2c\\x2d4fdd\\x2da9e2\\x2d3e2d39be4ae8.service", "IgnoreOnIsolate": "yes", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d93e795b2\\x2dee2c\\x2d4fdd\\x2da9e2\\x2d3e2d39be4ae8.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": 
"yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "0", "TimeoutStopUSec": "0", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "bad", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Saturday 31 August 2024 18:40:02 -0400 (0:00:00.649) 0:04:30.343 ******* fatal: [managed_node2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:109 Saturday 31 August 2024 18:40:06 -0400 (0:00:04.247) 0:04:34.591 ******* fatal: [managed_node2]: FAILED! 
=> { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'diskvolume_mkfs_option_map': {u'ext4': u'-F', u'ext3': u'-F', u'ext2': u'-F'}, u'safe_mode': True, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'encryption_tang_url': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_tang_thumbprint': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'encryption': True, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'mount_passno': 0, u'mount_user': None, u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'mount_group': None, u'fs_overwrite_existing': True, u'disks': [u'sda'], u'cached': False, u'thin_pool_size': None, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'shared': False, u'encryption_clevis_pin': None, u'type': u'partition', u'encryption_cipher': None, u'raid_chunk_size': None}], u'volumes': [], u'pool_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_password': None, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'shared': False, u'type': u'lvm', u'encryption_key_size': None, u'raid_chunk_size': None}, u'volume_defaults': {u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 0, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'cache_size': 0, u'raid_spare_count': None, u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption"} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: 
TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Saturday 31 August 2024 18:40:06 -0400 (0:00:00.051) 0:04:34.642 ******* changed: [managed_node2] => (item=systemd-cryptsetup@luks\x2d93e795b2\x2dee2c\x2d4fdd\x2da9e2\x2d3e2d39be4ae8.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d93e795b2\\x2dee2c\\x2d4fdd\\x2da9e2\\x2d3e2d39be4ae8.service", "name": "systemd-cryptsetup@luks\\x2d93e795b2\\x2dee2c\\x2d4fdd\\x2da9e2\\x2d3e2d39be4ae8.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d93e795b2\\x2dee2c\\x2d4fdd\\x2da9e2\\x2d3e2d39be4ae8.service", "DevicePolicy": "auto", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d93e795b2\\x2dee2c\\x2d4fdd\\x2da9e2\\x2d3e2d39be4ae8.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "1048576", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "masked", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d93e795b2\\x2dee2c\\x2d4fdd\\x2da9e2\\x2d3e2d39be4ae8.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5",
"StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", "UnitFileState": "bad", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Saturday 31 August 2024 18:40:07 -0400 (0:00:00.521) 0:04:35.163 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Saturday 31 August 2024 18:40:07 -0400 (0:00:00.042) 0:04:35.205 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Saturday 31 August 2024 18:40:07 -0400 (0:00:00.049) 0:04:35.255 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Saturday 31 August 2024 18:40:07 -0400 (0:00:00.033) 0:04:35.289 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1725143993.3741283, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1725143993.3741283, "dev": 2049, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 67, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1725143993.3741283, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "18446744071782462902", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Saturday 31 August 2024 18:40:07 -0400 (0:00:00.366) 0:04:35.656 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Create a key file] ******************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:293 Saturday 31 August 2024 18:40:07 -0400 (0:00:00.049) 0:04:35.705 ******* ok: [managed_node2] => { "changed": false, "gid": 0, "group": "root", "mode": "0600", 
"owner": "root", "path": "/tmp/storage_testfArzOLlukskey", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Write the key into the key file] ***************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:300 Saturday 31 August 2024 18:40:08 -0400 (0:00:00.532) 0:04:36.238 ******* ok: [managed_node2] => { "changed": false, "checksum": "7a4dff3752e2baf5617c57eaac048e2b95e8af91", "dest": "/tmp/storage_testfArzOLlukskey", "gid": 0, "group": "root", "md5sum": "4ac07b967150835c00d0865161e48744", "mode": "0600", "owner": "root", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 32, "src": "/root/.ansible/tmp/ansible-tmp-1725144008.47-2714-20618238241284/source", "state": "file", "uid": 0 } TASK [Add encryption to the volume] ******************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:307 Saturday 31 August 2024 18:40:09 -0400 (0:00:00.858) 0:04:37.096 ******* TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 31 August 2024 18:40:09 -0400 (0:00:00.057) 0:04:37.153 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 31 August 2024 18:40:09 -0400 (0:00:00.076) 0:04:37.230 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 31 August 2024 18:40:09 -0400 (0:00:00.060) 0:04:37.290 ******* skipping: [managed_node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed_node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed_node2] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed_node2] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 31 August 2024 18:40:09 -0400 (0:00:00.124) 0:04:37.415 ******* skipping: [managed_node2] 
=> { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 31 August 2024 18:40:09 -0400 (0:00:00.052) 0:04:37.467 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 31 August 2024 18:40:09 -0400 (0:00:00.051) 0:04:37.519 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 31 August 2024 18:40:09 -0400 (0:00:00.045) 0:04:37.564 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 31 August 2024 18:40:09 -0400 (0:00:00.043) 0:04:37.608 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 31 August 2024 18:40:09 -0400 (0:00:00.193) 0:04:37.802 ******* ok: [managed_node2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34 TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 31 August 2024 18:40:11 -0400 (0:00:01.348) 0:04:39.150 ******* ok: [managed_node2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_key": "/tmp/storage_testfArzOLlukskey", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 31 August 2024 18:40:11 -0400 (0:00:00.062) 0:04:39.213 
TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 31 August 2024 18:40:11 -0400 (0:00:00.062) 0:04:39.213 ******* ok: [managed_node2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 31 August 2024 18:40:11 -0400 (0:00:00.069) 0:04:39.283 ******* ok: [managed_node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Saturday 31 August 2024 18:40:15 -0400 (0:00:04.059) 0:04:43.343 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 31 August 2024 18:40:15 -0400 (0:00:00.060) 0:04:43.403 ******* TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 31 August 2024 18:40:15 -0400 (0:00:00.031) 0:04:43.434 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 31 August 2024 18:40:15 -0400 (0:00:00.031) 0:04:43.466 ******* TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 31 August 2024 18:40:15 -0400 (0:00:00.031) 0:04:43.498 ******* ok: [managed_node2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } lsrpackages: cryptsetup kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Saturday 31 August 2024 18:40:16 -0400 (0:00:00.648) 0:04:44.147 ******* ok: [managed_node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": {
"name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": 
"unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": 
"plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": 
"unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": 
"systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Saturday 31 August 2024 18:40:17 -0400 (0:00:01.099) 0:04:45.246 ******* ok: [managed_node2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Saturday 31 August 2024 18:40:17 -0400 (0:00:00.083) 0:04:45.330 ******* TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Saturday 31 August 2024 18:40:17 -0400 (0:00:00.056) 0:04:45.386 ******* changed: [managed_node2] => { "actions": [ { "action": "destroy format", "device": "/dev/sda1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-c2ec575e-3678-4666-8d31-df8dc94137b8", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-c2ec575e-3678-4666-8d31-df8dc94137b8", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-c2ec575e-3678-4666-8d31-df8dc94137b8", "password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-c2ec575e-3678-4666-8d31-df8dc94137b8" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=0d98aff8-ed7f-4ffa-8d82-ae0b994268e2", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": 
null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-c2ec575e-3678-4666-8d31-df8dc94137b8", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-c2ec575e-3678-4666-8d31-df8dc94137b8", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-c2ec575e-3678-4666-8d31-df8dc94137b8", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Saturday 31 August 2024 18:40:28 -0400 (0:00:11.131) 0:04:56.518 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Saturday 31 August 2024 18:40:28 -0400 (0:00:00.051) 0:04:56.570 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1725143979.6698873, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "0b6c229b370f4f8368468043578bf1bf3f20ce39", "ctime": 1725143979.6668873, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263550, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1725143979.6668873, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1299, "uid": 0, "version": "18446744071584693682", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add 
TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Saturday 31 August 2024 18:40:29 -0400 (0:00:00.429) 0:04:56.999 ******* ok: [managed_node2] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Saturday 31 August 2024 18:40:29 -0400 (0:00:00.408) 0:04:57.407 ******* TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Saturday 31 August 2024 18:40:29 -0400 (0:00:00.049) 0:04:57.457 ******* ok: [managed_node2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-c2ec575e-3678-4666-8d31-df8dc94137b8", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-c2ec575e-3678-4666-8d31-df8dc94137b8", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-c2ec575e-3678-4666-8d31-df8dc94137b8", "password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-c2ec575e-3678-4666-8d31-df8dc94137b8" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=0d98aff8-ed7f-4ffa-8d82-ae0b994268e2", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-c2ec575e-3678-4666-8d31-df8dc94137b8", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-c2ec575e-3678-4666-8d31-df8dc94137b8", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-c2ec575e-3678-4666-8d31-df8dc94137b8", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null,
"raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Saturday 31 August 2024 18:40:29 -0400 (0:00:00.069) 0:04:57.527 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-c2ec575e-3678-4666-8d31-df8dc94137b8", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-c2ec575e-3678-4666-8d31-df8dc94137b8", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Saturday 31 August 2024 18:40:29 -0400 (0:00:00.063) 0:04:57.590 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Saturday 31 August 2024 18:40:29 -0400 (0:00:00.057) 0:04:57.648 ******* changed: [managed_node2] => (item={u'src': u'UUID=0d98aff8-ed7f-4ffa-8d82-ae0b994268e2', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=0d98aff8-ed7f-4ffa-8d82-ae0b994268e2", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=0d98aff8-ed7f-4ffa-8d82-ae0b994268e2" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view 
of /etc/fstab] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Saturday 31 August 2024 18:40:30 -0400 (0:00:00.432) 0:04:58.081 ******* ok: [managed_node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Saturday 31 August 2024 18:40:30 -0400 (0:00:00.560) 0:04:58.642 ******* changed: [managed_node2] => (item={u'src': u'/dev/mapper/luks-c2ec575e-3678-4666-8d31-df8dc94137b8', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-c2ec575e-3678-4666-8d31-df8dc94137b8", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-c2ec575e-3678-4666-8d31-df8dc94137b8" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Saturday 31 August 2024 18:40:31 -0400 (0:00:00.442) 0:04:59.085 ******* skipping: [managed_node2] => (item={u'src': u'/dev/mapper/luks-c2ec575e-3678-4666-8d31-df8dc94137b8', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-c2ec575e-3678-4666-8d31-df8dc94137b8", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Saturday 31 August 2024 18:40:31 -0400 (0:00:00.046) 0:04:59.131 ******* ok: [managed_node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Saturday 31 August 2024 18:40:31 -0400 (0:00:00.505) 0:04:59.637 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1725143983.1659489, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1725143981.1029127, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 917511, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1725143981.1019125, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": 
"18446744071584696625", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Saturday 31 August 2024 18:40:32 -0400 (0:00:00.409) 0:05:00.047 ******* changed: [managed_node2] => (item={u'state': u'present', u'password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'name': u'luks-c2ec575e-3678-4666-8d31-df8dc94137b8', u'backing_device': u'/dev/sda1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-c2ec575e-3678-4666-8d31-df8dc94137b8", "password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Saturday 31 August 2024 18:40:32 -0400 (0:00:00.407) 0:05:00.454 ******* ok: [managed_node2] TASK [Verify role results] ***************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:324 Saturday 31 August 2024 18:40:33 -0400 (0:00:00.902) 0:05:01.357 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed_node2 TASK [Print out pool information] ********************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Saturday 31 August 2024 18:40:33 -0400 (0:00:00.086) 0:05:01.444 ******* ok: [managed_node2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-c2ec575e-3678-4666-8d31-df8dc94137b8", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-c2ec575e-3678-4666-8d31-df8dc94137b8", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": 
"partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Saturday 31 August 2024 18:40:33 -0400 (0:00:00.065) 0:05:01.510 ******* skipping: [managed_node2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Saturday 31 August 2024 18:40:33 -0400 (0:00:00.054) 0:05:01.564 ******* ok: [managed_node2] => { "changed": false, "info": { "/dev/mapper/luks-c2ec575e-3678-4666-8d31-df8dc94137b8": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-c2ec575e-3678-4666-8d31-df8dc94137b8", "size": "10G", "type": "crypt", "uuid": "8a395bdc-6748-43b2-900d-51e38807fbf4" }, "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "c2ec575e-3678-4666-8d31-df8dc94137b8" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Saturday 31 August 2024 18:40:34 -0400 (0:00:00.525) 0:05:02.090 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003166", "end": "2024-08-31 18:40:34.561410", "rc": 0, "start": "2024-08-31 18:40:34.558244" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Thu Jun 20 10:23:46 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs 
defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-c2ec575e-3678-4666-8d31-df8dc94137b8 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Saturday 31 August 2024 18:40:34 -0400 (0:00:00.361) 0:05:02.451 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003299", "end": "2024-08-31 18:40:34.915306", "failed_when_result": false, "rc": 0, "start": "2024-08-31 18:40:34.912007" } STDOUT: luks-c2ec575e-3678-4666-8d31-df8dc94137b8 /dev/sda1 VALUE_SPECIFIED_IN_NO_LOG_PARAMETER TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Saturday 31 August 2024 18:40:34 -0400 (0:00:00.360) 0:05:02.812 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed_node2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Saturday 31 August 2024 18:40:35 -0400 (0:00:00.116) 0:05:02.929 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Saturday 31 August 2024 18:40:35 -0400 (0:00:00.045) 0:05:02.974 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Saturday 31 August 2024 18:40:35 -0400 (0:00:00.046) 0:05:03.021 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Saturday 31 August 2024 18:40:35 -0400 (0:00:00.047) 0:05:03.068 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed_node2 included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed_node2 TASK [Set test variables] ****************************************************** task path: 
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Saturday 31 August 2024 18:40:35 -0400 (0:00:00.091) 0:05:03.160 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Saturday 31 August 2024 18:40:35 -0400 (0:00:00.035) 0:05:03.195 ******* TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Saturday 31 August 2024 18:40:35 -0400 (0:00:00.050) 0:05:03.246 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Saturday 31 August 2024 18:40:35 -0400 (0:00:00.055) 0:05:03.301 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Saturday 31 August 2024 18:40:35 -0400 (0:00:00.052) 0:05:03.354 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Saturday 31 August 2024 18:40:35 -0400 (0:00:00.053) 0:05:03.407 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Saturday 31 August 2024 18:40:35 -0400 (0:00:00.051) 0:05:03.459 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Saturday 31 August 2024 18:40:35 -0400 (0:00:00.051) 0:05:03.511 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:51 Saturday 31 August 2024 18:40:35 -0400 (0:00:00.061) 0:05:03.572 ******* TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:64 Saturday 31 August 2024 18:40:35 -0400 (0:00:00.051) 0:05:03.623 ******* ok: [managed_node2] => { "changed": false, "rc": 0 } STDOUT: False STDERR: Shared connection to 10.31.12.24 closed. 
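NOTE: The verification tasks in this part of the log are checking the encrypted-partition specification dumped above by the "Set the list of pools for test verification" and "Print out pool information" tasks. As a minimal reconstructed sketch (not taken from this run's playbook; the passphrase below is a stand-in for the secret the log masks as VALUE_SPECIFIED_IN_NO_LOG_PARAMETER), a role invocation producing that state would look roughly like:

    - name: Create an encrypted partition volume mounted at /opt/test1 (illustrative sketch)
      hosts: managed_node2
      tasks:
        - name: Run the fedora.linux_system_roles.storage role
          include_role:
            name: fedora.linux_system_roles.storage
          vars:
            storage_pools:
              - name: foo
                type: partition
                disks:
                  - sda
                volumes:
                  - name: test1
                    type: partition
                    size: "4g"
                    fs_type: xfs
                    mount_point: /opt/test1
                    mount_options: defaults
                    encryption: true
                    # Stand-in value; the real passphrase is masked in this log.
                    encryption_password: "<luks-passphrase>"

Every key in this sketch appears verbatim in the _storage_pools_list facts printed earlier; the role's "Set up new/current mounts" and "Manage /etc/crypttab" tasks shown above are what turn such a declaration into the /etc/fstab and /etc/crypttab entries that the remaining tasks verify.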
TASK [Verify that PVs fill their whole devices when they should] ************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:73 Saturday 31 August 2024 18:40:36 -0400 (0:00:00.300) 0:05:03.924 ******* TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:83 Saturday 31 August 2024 18:40:36 -0400 (0:00:00.047) 0:05:03.971 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed_node2 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Saturday 31 August 2024 18:40:36 -0400 (0:00:00.111) 0:05:04.083 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Saturday 31 August 2024 18:40:36 -0400 (0:00:00.057) 0:05:04.141 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Saturday 31 August 2024 18:40:36 -0400 (0:00:00.055) 0:05:04.196 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Saturday 31 August 2024 18:40:36 -0400 (0:00:00.048) 0:05:04.245 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Saturday 31 August 2024 18:40:36 -0400 (0:00:00.066) 0:05:04.311 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Saturday 31 August 2024 18:40:36 -0400 (0:00:00.055) 0:05:04.382 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Saturday 31 August 2024 18:40:36 -0400 (0:00:00.050) 0:05:04.437 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Saturday 31 August 2024 18:40:36 -0400 (0:00:00.050) 0:05:04.488 ******* skipping: [managed_node2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Saturday 31 August 2024 18:40:36 -0400 (0:00:00.053) 0:05:04.541 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Saturday 31 August 2024 18:40:36 -0400 (0:00:00.054) 0:05:04.596 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Saturday 31 August 2024 18:40:36 -0400 (0:00:00.058) 0:05:04.654 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:86 Saturday 31 August 2024 18:40:36 -0400 (0:00:00.054) 0:05:04.709 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed_node2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Saturday 31 August 2024 18:40:37 -0400 (0:00:00.115) 0:05:04.825 ******* skipping: [managed_node2] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'/dev/mapper/luks-c2ec575e-3678-4666-8d31-df8dc94137b8', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'_kernel_device': u'/dev/dm-0', u'encryption': True, u'raid_level': None, u'name': u'test1', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'mount_passno': 0, u'_mount_id': u'/dev/mapper/luks-c2ec575e-3678-4666-8d31-df8dc94137b8', u'mount_user': None, u'raid_spare_count': None, u'raid_disks': [], u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'cache_devices': [], u'deduplication': None, u'mount_group': None, u'thin_pool_size': None, u'disks': [u'sda'], u'cached': False, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'fs_overwrite_existing': True, u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/mapper/luks-c2ec575e-3678-4666-8d31-df8dc94137b8", "_kernel_device": "/dev/dm-0", "_mount_id": 
"/dev/mapper/luks-c2ec575e-3678-4666-8d31-df8dc94137b8", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:89 Saturday 31 August 2024 18:40:37 -0400 (0:00:00.074) 0:05:04.900 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed_node2 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Saturday 31 August 2024 18:40:37 -0400 (0:00:00.123) 0:05:05.023 ******* skipping: [managed_node2] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'/dev/mapper/luks-c2ec575e-3678-4666-8d31-df8dc94137b8', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'_kernel_device': u'/dev/dm-0', u'encryption': True, u'raid_level': None, u'name': u'test1', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'mount_passno': 0, u'_mount_id': u'/dev/mapper/luks-c2ec575e-3678-4666-8d31-df8dc94137b8', u'mount_user': None, u'raid_spare_count': None, u'raid_disks': [], u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'cache_devices': [], u'deduplication': None, u'mount_group': None, u'thin_pool_size': None, u'disks': [u'sda'], u'cached': False, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'fs_overwrite_existing': True, u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/mapper/luks-c2ec575e-3678-4666-8d31-df8dc94137b8", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-c2ec575e-3678-4666-8d31-df8dc94137b8", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" 
], "encryption": true, "encryption_cipher": null, "encryption_key": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:92 Saturday 31 August 2024 18:40:37 -0400 (0:00:00.078) 0:05:05.101 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed_node2 TASK [Set test variables] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Saturday 31 August 2024 18:40:37 -0400 (0:00:00.132) 0:05:05.234 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Saturday 31 August 2024 18:40:37 -0400 (0:00:00.066) 0:05:05.301 ******* TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Saturday 31 August 2024 18:40:37 -0400 (0:00:00.055) 0:05:05.356 ******* TASK [Clear test variables] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Saturday 31 August 2024 18:40:37 -0400 (0:00:00.052) 0:05:05.409 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:95 Saturday 31 August 2024 18:40:37 -0400 (0:00:00.061) 0:05:05.471 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed_node2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Saturday 31 August 2024 18:40:37 -0400 (0:00:00.192) 0:05:05.663 ******* skipping: [managed_node2] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', 
u'_device': u'/dev/mapper/luks-c2ec575e-3678-4666-8d31-df8dc94137b8', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'_kernel_device': u'/dev/dm-0', u'encryption': True, u'raid_level': None, u'name': u'test1', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'mount_passno': 0, u'_mount_id': u'/dev/mapper/luks-c2ec575e-3678-4666-8d31-df8dc94137b8', u'mount_user': None, u'raid_spare_count': None, u'raid_disks': [], u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'cache_devices': [], u'deduplication': None, u'mount_group': None, u'thin_pool_size': None, u'disks': [u'sda'], u'cached': False, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'fs_overwrite_existing': True, u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/mapper/luks-c2ec575e-3678-4666-8d31-df8dc94137b8", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-c2ec575e-3678-4666-8d31-df8dc94137b8", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:98 Saturday 31 August 2024 18:40:37 -0400 (0:00:00.077) 0:05:05.741 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed_node2 TASK [Run 'stratis report'] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Saturday 31 August 2024 18:40:38 -0400 (0:00:00.137) 0:05:05.878 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:11 Saturday 31 August 2024 18:40:38 -0400 (0:00:00.054) 0:05:05.932 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result 
was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Saturday 31 August 2024 18:40:38 -0400 (0:00:00.050) 0:05:05.983 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:25 Saturday 31 August 2024 18:40:38 -0400 (0:00:00.050) 0:05:06.034 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:34 Saturday 31 August 2024 18:40:38 -0400 (0:00:00.050) 0:05:06.084 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:44 Saturday 31 August 2024 18:40:38 -0400 (0:00:00.043) 0:05:06.128 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:101 Saturday 31 August 2024 18:40:38 -0400 (0:00:00.041) 0:05:06.169 ******* ok: [managed_node2] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Saturday 31 August 2024 18:40:38 -0400 (0:00:00.047) 0:05:06.217 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed_node2 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Saturday 31 August 2024 18:40:38 -0400 (0:00:00.079) 0:05:06.296 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Saturday 31 August 2024 18:40:38 -0400 (0:00:00.043) 0:05:06.340 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed_node2 included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed_node2 included: 
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed_node2 included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed_node2 included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed_node2 included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed_node2 included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed_node2 included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed_node2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Saturday 31 August 2024 18:40:38 -0400 (0:00:00.266) 0:05:06.606 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-c2ec575e-3678-4666-8d31-df8dc94137b8" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Saturday 31 August 2024 18:40:38 -0400 (0:00:00.060) 0:05:06.667 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Saturday 31 August 2024 18:40:38 -0400 (0:00:00.067) 0:05:06.734 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Saturday 31 August 2024 18:40:38 -0400 (0:00:00.054) 0:05:06.789 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Saturday 31 August 2024 18:40:39 -0400 (0:00:00.064) 0:05:06.854 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Saturday 31 August 2024 18:40:39 -0400 (0:00:00.055) 0:05:06.910 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Saturday 31 August 2024 18:40:39 -0400 (0:00:00.058) 0:05:06.968 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of 
test volume device] ****************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Saturday 31 August 2024 18:40:39 -0400 (0:00:00.050) 0:05:07.018 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Saturday 31 August 2024 18:40:39 -0400 (0:00:00.040) 0:05:07.059 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Saturday 31 August 2024 18:40:39 -0400 (0:00:00.040) 0:05:07.099 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Saturday 31 August 2024 18:40:39 -0400 (0:00:00.041) 0:05:07.141 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Saturday 31 August 2024 18:40:39 -0400 (0:00:00.038) 0:05:07.180 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-c2ec575e-3678-4666-8d31-df8dc94137b8 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Saturday 31 August 2024 18:40:39 -0400 (0:00:00.064) 0:05:07.244 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Saturday 31 August 2024 18:40:39 -0400 (0:00:00.040) 0:05:07.285 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Saturday 31 August 2024 18:40:39 -0400 (0:00:00.048) 0:05:07.333 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: 
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Saturday 31 August 2024 18:40:39 -0400 (0:00:00.050) 0:05:07.384 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:51 Saturday 31 August 2024 18:40:39 -0400 (0:00:00.058) 0:05:07.443 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Saturday 31 August 2024 18:40:39 -0400 (0:00:00.054) 0:05:07.498 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Saturday 31 August 2024 18:40:39 -0400 (0:00:00.073) 0:05:07.571 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Saturday 31 August 2024 18:40:39 -0400 (0:00:00.084) 0:05:07.655 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1725144028.3957446, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1725144028.3957446, "dev": 5, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 215786, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1725144028.3957446, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Saturday 31 August 2024 18:40:40 -0400 (0:00:00.435) 0:05:08.091 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Saturday 31 August 2024 18:40:40 -0400 (0:00:00.072) 0:05:08.163 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: 
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Saturday 31 August 2024 18:40:40 -0400 (0:00:00.049) 0:05:08.213 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Saturday 31 August 2024 18:40:40 -0400 (0:00:00.061) 0:05:08.274 ******* ok: [managed_node2] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Saturday 31 August 2024 18:40:40 -0400 (0:00:00.057) 0:05:08.332 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Saturday 31 August 2024 18:40:40 -0400 (0:00:00.051) 0:05:08.383 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Saturday 31 August 2024 18:40:40 -0400 (0:00:00.059) 0:05:08.442 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1725144028.5537474, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1725144028.5537474, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 215827, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1725144028.5537474, "nlink": 1, "path": "/dev/mapper/luks-c2ec575e-3678-4666-8d31-df8dc94137b8", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Saturday 31 August 2024 18:40:41 -0400 (0:00:00.412) 0:05:08.855 ******* ok: [managed_node2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Saturday 31 August 2024 18:40:41 -0400 (0:00:00.764) 0:05:09.619 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda1" ], "delta": "0:00:00.028572", "end": "2024-08-31 18:40:42.145288", "rc": 0, "start": "2024-08-31 18:40:42.116716" } STDOUT: LUKS header information for /dev/sda1 Version: 1 Cipher name: aes Cipher mode: xts-plain64 Hash spec: sha256 Payload offset: 
8192 MK bits: 512 MK digest: 58 17 c7 5c ee b2 14 8b ac 8c d2 1f 85 8f aa 4c 22 ee fd 38 MK salt: e3 f7 10 1d c4 03 3c ef 32 91 3b 4a 98 04 ee ca cc d8 24 20 96 64 16 74 f2 95 f3 1e 87 87 74 1f MK iterations: 20557 UUID: c2ec575e-3678-4666-8d31-df8dc94137b8 Key Slot 0: ENABLED Iterations: 328500 Salt: fc e5 77 2b 3b 0e 19 34 d2 e1 d6 cc 31 a5 5c 98 cb 9f 1a 56 ca 9d aa 73 7a 65 46 6d 67 2d 29 9b Key material offset: 8 AF stripes: 4000 Key Slot 1: DISABLED Key Slot 2: DISABLED Key Slot 3: DISABLED Key Slot 4: DISABLED Key Slot 5: DISABLED Key Slot 6: DISABLED Key Slot 7: DISABLED TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Saturday 31 August 2024 18:40:42 -0400 (0:00:00.513) 0:05:10.133 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Saturday 31 August 2024 18:40:42 -0400 (0:00:00.069) 0:05:10.203 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Saturday 31 August 2024 18:40:42 -0400 (0:00:00.073) 0:05:10.276 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Saturday 31 August 2024 18:40:42 -0400 (0:00:00.077) 0:05:10.353 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Saturday 31 August 2024 18:40:42 -0400 (0:00:00.102) 0:05:10.456 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Saturday 31 August 2024 18:40:42 -0400 (0:00:00.060) 0:05:10.517 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Saturday 31 August 2024 18:40:42 -0400 (0:00:00.052) 0:05:10.570 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Saturday 31 August 2024 18:40:42 -0400 (0:00:00.057) 0:05:10.627 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-c2ec575e-3678-4666-8d31-df8dc94137b8 
/dev/sda1 VALUE_SPECIFIED_IN_NO_LOG_PARAMETER" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Saturday 31 August 2024 18:40:42 -0400 (0:00:00.051) 0:05:10.679 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Saturday 31 August 2024 18:40:42 -0400 (0:00:00.063) 0:05:10.742 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Saturday 31 August 2024 18:40:42 -0400 (0:00:00.061) 0:05:10.803 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Saturday 31 August 2024 18:40:43 -0400 (0:00:00.053) 0:05:10.857 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Saturday 31 August 2024 18:40:43 -0400 (0:00:00.044) 0:05:10.902 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Saturday 31 August 2024 18:40:43 -0400 (0:00:00.035) 0:05:10.938 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Saturday 31 August 2024 18:40:43 -0400 (0:00:00.033) 0:05:10.972 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Saturday 31 August 2024 18:40:43 -0400 (0:00:00.033) 0:05:11.006 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Saturday 31 August 2024 18:40:43 -0400 (0:00:00.033) 0:05:11.039 ******* skipping: [managed_node2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Saturday 31 August 2024 18:40:43 -0400 (0:00:00.036) 0:05:11.075 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Saturday 31 August 2024 18:40:43 -0400 (0:00:00.033) 0:05:11.108 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Saturday 31 August 2024 18:40:43 -0400 (0:00:00.033) 0:05:11.142 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Saturday 31 August 2024 18:40:43 -0400 (0:00:00.032) 0:05:11.175 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Saturday 31 August 2024 18:40:43 -0400 (0:00:00.032) 0:05:11.208 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Saturday 31 August 2024 18:40:43 -0400 (0:00:00.032) 0:05:11.241 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Saturday 31 August 2024 18:40:43 -0400 (0:00:00.036) 0:05:11.277 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Saturday 31 August 2024 18:40:43 -0400 (0:00:00.036) 0:05:11.313 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Saturday 31 August 2024 18:40:43 -0400 (0:00:00.036) 0:05:11.349 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: 
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Saturday 31 August 2024 18:40:43 -0400 (0:00:00.037) 0:05:11.387 ******* ok: [managed_node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Saturday 31 August 2024 18:40:43 -0400 (0:00:00.038) 0:05:11.425 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Saturday 31 August 2024 18:40:43 -0400 (0:00:00.036) 0:05:11.461 ******* skipping: [managed_node2] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Saturday 31 August 2024 18:40:43 -0400 (0:00:00.038) 0:05:11.500 ******* skipping: [managed_node2] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Saturday 31 August 2024 18:40:43 -0400 (0:00:00.035) 0:05:11.535 ******* skipping: [managed_node2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Saturday 31 August 2024 18:40:43 -0400 (0:00:00.035) 0:05:11.571 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Saturday 31 August 2024 18:40:43 -0400 (0:00:00.035) 0:05:11.606 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Saturday 31 August 2024 18:40:43 -0400 (0:00:00.034) 0:05:11.641 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Saturday 31 August 2024 18:40:43 -0400 (0:00:00.032) 0:05:11.673 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Saturday 31 August 2024 18:40:43 -0400 (0:00:00.036) 0:05:11.710 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] 
******************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Saturday 31 August 2024 18:40:43 -0400 (0:00:00.032) 0:05:11.742 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Saturday 31 August 2024 18:40:43 -0400 (0:00:00.032) 0:05:11.775 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Saturday 31 August 2024 18:40:43 -0400 (0:00:00.039) 0:05:11.815 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Saturday 31 August 2024 18:40:44 -0400 (0:00:00.050) 0:05:11.865 ******* skipping: [managed_node2] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Saturday 31 August 2024 18:40:44 -0400 (0:00:00.065) 0:05:11.931 ******* skipping: [managed_node2] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Saturday 31 August 2024 18:40:44 -0400 (0:00:00.056) 0:05:11.987 ******* skipping: [managed_node2] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Saturday 31 August 2024 18:40:44 -0400 (0:00:00.052) 0:05:12.040 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Saturday 31 August 2024 18:40:44 -0400 (0:00:00.053) 0:05:12.094 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Saturday 31 August 2024 18:40:44 -0400 (0:00:00.051) 0:05:12.145 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Saturday 31 August 2024 18:40:44 -0400 (0:00:00.050) 0:05:12.195 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] 
********************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Saturday 31 August 2024 18:40:44 -0400 (0:00:00.048) 0:05:12.244 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Saturday 31 August 2024 18:40:44 -0400 (0:00:00.043) 0:05:12.288 ******* ok: [managed_node2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Saturday 31 August 2024 18:40:44 -0400 (0:00:00.069) 0:05:12.358 ******* ok: [managed_node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Saturday 31 August 2024 18:40:44 -0400 (0:00:00.053) 0:05:12.411 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Saturday 31 August 2024 18:40:44 -0400 (0:00:00.042) 0:05:12.454 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Saturday 31 August 2024 18:40:44 -0400 (0:00:00.034) 0:05:12.488 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Saturday 31 August 2024 18:40:44 -0400 (0:00:00.032) 0:05:12.521 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Saturday 31 August 2024 18:40:44 -0400 (0:00:00.035) 0:05:12.557 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Saturday 31 August 2024 18:40:44 -0400 (0:00:00.033) 0:05:12.590 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: 
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Saturday 31 August 2024 18:40:44 -0400 (0:00:00.033) 0:05:12.623 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Saturday 31 August 2024 18:40:44 -0400 (0:00:00.032) 0:05:12.656 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Saturday 31 August 2024 18:40:44 -0400 (0:00:00.033) 0:05:12.690 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Saturday 31 August 2024 18:40:44 -0400 (0:00:00.033) 0:05:12.724 ******* TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Saturday 31 August 2024 18:40:44 -0400 (0:00:00.032) 0:05:12.757 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Remove the key file] ***************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:327 Saturday 31 August 2024 18:40:44 -0400 (0:00:00.032) 0:05:12.790 ******* ok: [managed_node2] => { "changed": false, "path": "/tmp/storage_testfArzOLlukskey", "state": "absent" } TASK [Test for correct handling of new encrypted volume w/ no key] ************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:337 Saturday 31 August 2024 18:40:45 -0400 (0:00:00.342) 0:05:13.132 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed_node2 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Saturday 31 August 2024 18:40:45 -0400 (0:00:00.100) 0:05:13.233 ******* ok: [managed_node2] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Saturday 31 August 2024 18:40:45 -0400 (0:00:00.041) 0:05:13.275 ******* TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 31 August 2024 18:40:45 -0400 (0:00:00.050) 0:05:13.325 ******* included: 
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 31 August 2024 18:40:45 -0400 (0:00:00.053) 0:05:13.379 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 31 August 2024 18:40:45 -0400 (0:00:00.041) 0:05:13.420 ******* skipping: [managed_node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed_node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed_node2] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed_node2] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 31 August 2024 18:40:45 -0400 (0:00:00.082) 0:05:13.503 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 31 August 2024 18:40:45 -0400 (0:00:00.032) 0:05:13.536 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 31 August 2024 18:40:45 -0400 (0:00:00.034) 0:05:13.570 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 31 August 2024 18:40:45 -0400 (0:00:00.034) 0:05:13.605 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: 
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 31 August 2024 18:40:45 -0400 (0:00:00.034) 0:05:13.639 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 31 August 2024 18:40:45 -0400 (0:00:00.085) 0:05:13.725 ******* ok: [managed_node2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34 TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 31 August 2024 18:40:47 -0400 (0:00:01.391) 0:05:15.117 ******* ok: [managed_node2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 31 August 2024 18:40:47 -0400 (0:00:00.043) 0:05:15.160 ******* ok: [managed_node2] => { "storage_volumes": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 31 August 2024 18:40:47 -0400 (0:00:00.041) 0:05:15.202 ******* ok: [managed_node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Saturday 31 August 2024 18:40:51 -0400 (0:00:04.346) 0:05:19.548 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 31 August 2024 18:40:51 -0400 (0:00:00.083) 0:05:19.631 ******* TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: 
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 31 August 2024 18:40:51 -0400 (0:00:00.040) 0:05:19.672 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 31 August 2024 18:40:51 -0400 (0:00:00.044) 0:05:19.716 ******* TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 31 August 2024 18:40:51 -0400 (0:00:00.033) 0:05:19.750 ******* ok: [managed_node2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } lsrpackages: cryptsetup kpartx lvm2 TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Saturday 31 August 2024 18:40:52 -0400 (0:00:00.783) 0:05:20.533 ******* ok: [managed_node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": 
"stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", 
"status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": 
"network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { 
"name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", 
"status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { 
"name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Saturday 31 August 2024 18:40:53 -0400 (0:00:01.255) 0:05:21.788 ******* ok: [managed_node2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Saturday 31 August 2024 18:40:54 -0400 (0:00:00.096) 0:05:21.885 ******* TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Saturday 31 August 2024 18:40:54 -0400 (0:00:00.053) 0:05:21.938 ******* fatal: [managed_node2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'test1' missing key/password TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:109 Saturday 31 August 2024 18:40:58 -0400 (0:00:04.067) 0:05:26.005 ******* fatal: [managed_node2]: FAILED! 
=> { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'diskvolume_mkfs_option_map': {u'ext4': u'-F', u'ext3': u'-F', u'ext2': u'-F'}, u'safe_mode': False, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'encryption_tang_url': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_tang_thumbprint': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'encryption': True, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'mount_passno': 0, u'mount_user': None, u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'mount_group': None, u'fs_overwrite_existing': True, u'disks': [], u'cached': False, u'thin_pool_size': None, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'shared': False, u'encryption_clevis_pin': None, u'type': u'lvm', u'encryption_cipher': None, u'raid_chunk_size': None}], u'volumes': [], u'pool_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_password': None, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'shared': False, u'type': u'lvm', u'encryption_key_size': None, u'raid_chunk_size': None}, u'volume_defaults': {u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 0, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'cache_size': 0, u'raid_spare_count': None, u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"encrypted volume 'test1' missing key/password"} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Saturday 31 August 2024 18:40:58 -0400 (0:00:00.098) 0:05:26.104 ******* 
TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Saturday 31 August 2024 18:40:58 -0400 (0:00:00.096) 0:05:26.201 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Saturday 31 August 2024 18:40:58 -0400 (0:00:00.097) 0:05:26.298 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Saturday 31 August 2024 18:40:58 -0400 (0:00:00.157) 0:05:26.456 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Create an encrypted lvm volume w/ default fs] **************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:355 Saturday 31 August 2024 18:40:58 -0400 (0:00:00.125) 0:05:26.581 ******* TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 31 August 2024 18:40:58 -0400 (0:00:00.209) 0:05:26.791 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 31 August 2024 18:40:59 -0400 (0:00:00.186) 0:05:26.977 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 31 August 2024 18:40:59 -0400 (0:00:00.063) 0:05:27.041 ******* skipping: [managed_node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed_node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed_node2] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed_node2] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] 
*********** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 31 August 2024 18:40:59 -0400 (0:00:00.132) 0:05:27.173 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 31 August 2024 18:40:59 -0400 (0:00:00.066) 0:05:27.239 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 31 August 2024 18:40:59 -0400 (0:00:00.066) 0:05:27.306 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 31 August 2024 18:40:59 -0400 (0:00:00.064) 0:05:27.371 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 31 August 2024 18:40:59 -0400 (0:00:00.067) 0:05:27.438 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 31 August 2024 18:40:59 -0400 (0:00:00.130) 0:05:27.568 ******* ok: [managed_node2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34 TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 31 August 2024 18:41:01 -0400 (0:00:01.678) 0:05:29.247 ******* ok: [managed_node2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": 
"test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 31 August 2024 18:41:01 -0400 (0:00:00.092) 0:05:29.339 ******* ok: [managed_node2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 31 August 2024 18:41:01 -0400 (0:00:00.070) 0:05:29.409 ******* ok: [managed_node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Saturday 31 August 2024 18:41:05 -0400 (0:00:04.402) 0:05:33.812 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 31 August 2024 18:41:06 -0400 (0:00:00.119) 0:05:33.931 ******* TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 31 August 2024 18:41:06 -0400 (0:00:00.067) 0:05:33.999 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 31 August 2024 18:41:06 -0400 (0:00:00.075) 0:05:34.075 ******* TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 31 August 2024 18:41:06 -0400 (0:00:00.052) 0:05:34.127 ******* ok: [managed_node2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } lsrpackages: cryptsetup kpartx lvm2 TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Saturday 31 August 2024 18:41:07 -0400 (0:00:01.043) 0:05:35.171 ******* ok: [managed_node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, 
"arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": 
"systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": 
"systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", 
"source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": 
"systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Saturday 31 August 2024 18:41:08 -0400 (0:00:01.462) 0:05:36.634 ******* ok: [managed_node2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Saturday 31 August 2024 18:41:08 -0400 (0:00:00.131) 0:05:36.765 ******* TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Saturday 31 August 2024 18:41:09 -0400 (0:00:00.103) 0:05:36.869 ******* changed: [managed_node2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-c2ec575e-3678-4666-8d31-df8dc94137b8", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-c2ec575e-3678-4666-8d31-df8dc94137b8", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create 
device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-a4395052-edbc-405e-bf63-64a8c90ae727", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-a4395052-edbc-405e-bf63-64a8c90ae727", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-c2ec575e-3678-4666-8d31-df8dc94137b8", "password": "-", "state": "absent" }, { "backing_device": "/dev/mapper/foo-test1", "name": "luks-a4395052-edbc-405e-bf63-64a8c90ae727", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-a4395052-edbc-405e-bf63-64a8c90ae727" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-c2ec575e-3678-4666-8d31-df8dc94137b8", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-a4395052-edbc-405e-bf63-64a8c90ae727", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-a4395052-edbc-405e-bf63-64a8c90ae727", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-a4395052-edbc-405e-bf63-64a8c90ae727", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Saturday 31 August 2024 18:41:20 -0400 (0:00:11.720) 0:05:48.590 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: 
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Saturday 31 August 2024 18:41:20 -0400 (0:00:00.033) 0:05:48.623 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1725144031.1637933, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "8a62ce38cc804e13ad91f9f8a6ff5b094501f27f", "ctime": 1725144031.1607933, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263550, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1725144031.1607933, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1311, "uid": 0, "version": "18446744071584693682", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Saturday 31 August 2024 18:41:21 -0400 (0:00:00.344) 0:05:48.968 ******* ok: [managed_node2] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Saturday 31 August 2024 18:41:21 -0400 (0:00:00.360) 0:05:49.328 ******* TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Saturday 31 August 2024 18:41:21 -0400 (0:00:00.033) 0:05:49.362 ******* ok: [managed_node2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-c2ec575e-3678-4666-8d31-df8dc94137b8", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-c2ec575e-3678-4666-8d31-df8dc94137b8", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-a4395052-edbc-405e-bf63-64a8c90ae727", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-a4395052-edbc-405e-bf63-64a8c90ae727", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-c2ec575e-3678-4666-8d31-df8dc94137b8", "password": "-", "state": "absent" }, { "backing_device": "/dev/mapper/foo-test1", "name": "luks-a4395052-edbc-405e-bf63-64a8c90ae727", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-a4395052-edbc-405e-bf63-64a8c90ae727" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", 
"src": "/dev/mapper/luks-c2ec575e-3678-4666-8d31-df8dc94137b8", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-a4395052-edbc-405e-bf63-64a8c90ae727", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-a4395052-edbc-405e-bf63-64a8c90ae727", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-a4395052-edbc-405e-bf63-64a8c90ae727", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Saturday 31 August 2024 18:41:21 -0400 (0:00:00.051) 0:05:49.414 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-a4395052-edbc-405e-bf63-64a8c90ae727", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-a4395052-edbc-405e-bf63-64a8c90ae727", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": 
"VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Saturday 31 August 2024 18:41:21 -0400 (0:00:00.046) 0:05:49.461 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Saturday 31 August 2024 18:41:21 -0400 (0:00:00.042) 0:05:49.503 ******* changed: [managed_node2] => (item={u'src': u'/dev/mapper/luks-c2ec575e-3678-4666-8d31-df8dc94137b8', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-c2ec575e-3678-4666-8d31-df8dc94137b8", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-c2ec575e-3678-4666-8d31-df8dc94137b8" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Saturday 31 August 2024 18:41:22 -0400 (0:00:00.381) 0:05:49.884 ******* ok: [managed_node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Saturday 31 August 2024 18:41:22 -0400 (0:00:00.511) 0:05:50.395 ******* changed: [managed_node2] => (item={u'src': u'/dev/mapper/luks-a4395052-edbc-405e-bf63-64a8c90ae727', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-a4395052-edbc-405e-bf63-64a8c90ae727", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-a4395052-edbc-405e-bf63-64a8c90ae727" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Saturday 31 August 2024 18:41:22 -0400 (0:00:00.374) 0:05:50.770 ******* skipping: [managed_node2] => (item={u'src': 
u'/dev/mapper/luks-a4395052-edbc-405e-bf63-64a8c90ae727', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-a4395052-edbc-405e-bf63-64a8c90ae727", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Saturday 31 August 2024 18:41:23 -0400 (0:00:00.047) 0:05:50.817 ******* ok: [managed_node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Saturday 31 August 2024 18:41:23 -0400 (0:00:00.514) 0:05:51.332 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1725144034.9138594, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "fd5b15a6bf96fb94ad129bcfab5df1f877344918", "ctime": 1725144032.5478177, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 917512, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1725144032.5468178, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 88, "uid": 0, "version": "18446744071584696804", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Saturday 31 August 2024 18:41:23 -0400 (0:00:00.334) 0:05:51.666 ******* changed: [managed_node2] => (item={u'state': u'absent', u'password': u'-', u'name': u'luks-c2ec575e-3678-4666-8d31-df8dc94137b8', u'backing_device': u'/dev/sda1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-c2ec575e-3678-4666-8d31-df8dc94137b8", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed changed: [managed_node2] => (item={u'state': u'present', u'password': u'-', u'name': u'luks-a4395052-edbc-405e-bf63-64a8c90ae727', u'backing_device': u'/dev/mapper/foo-test1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-a4395052-edbc-405e-bf63-64a8c90ae727", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Saturday 31 August 2024 18:41:24 -0400 (0:00:00.678) 0:05:52.345 ******* ok: [managed_node2] TASK [Verify role results] ***************************************************** task 
path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:374 Saturday 31 August 2024 18:41:25 -0400 (0:00:00.758) 0:05:53.104 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed_node2 TASK [Print out pool information] ********************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Saturday 31 August 2024 18:41:25 -0400 (0:00:00.096) 0:05:53.200 ******* ok: [managed_node2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-a4395052-edbc-405e-bf63-64a8c90ae727", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-a4395052-edbc-405e-bf63-64a8c90ae727", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Saturday 31 August 2024 18:41:25 -0400 (0:00:00.072) 0:05:53.273 ******* skipping: [managed_node2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Saturday 31 August 2024 18:41:25 -0400 (0:00:00.049) 0:05:53.323 ******* ok: [managed_node2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "a4395052-edbc-405e-bf63-64a8c90ae727" }, "/dev/mapper/luks-a4395052-edbc-405e-bf63-64a8c90ae727": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-a4395052-edbc-405e-bf63-64a8c90ae727", "size": "4G", "type": "crypt", "uuid": "0cdc64a0-04e9-4033-bbf7-f44812397b4d" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "e3t0qk-oSfp-di6k-1NWD-tOt2-EOV5-cOetCf" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Saturday 31 August 2024 18:41:25 -0400 (0:00:00.387) 0:05:53.711 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003358", "end": "2024-08-31 18:41:26.182750", "rc": 0, "start": "2024-08-31 18:41:26.179392" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Thu Jun 20 10:23:46 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-a4395052-edbc-405e-bf63-64a8c90ae727 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Saturday 31 August 2024 18:41:26 -0400 (0:00:00.366) 0:05:54.078 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003350", "end": "2024-08-31 18:41:26.556649", "failed_when_result": false, "rc": 0, "start": "2024-08-31 18:41:26.553299" } STDOUT: luks-a4395052-edbc-405e-bf63-64a8c90ae727 /dev/mapper/foo-test1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Saturday 31 August 2024 18:41:26 -0400 (0:00:00.390) 0:05:54.468 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed_node2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Saturday 31 August 2024 18:41:26 -0400 (0:00:00.117) 0:05:54.585 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Saturday 31 August 2024 18:41:26 -0400 (0:00:00.053) 0:05:54.638 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.022579", "end": "2024-08-31 18:41:27.163772", "rc": 0, "start": "2024-08-31 18:41:27.141193" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Saturday 31 August 2024 18:41:27 -0400 (0:00:00.418) 0:05:55.057 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Saturday 31 August 2024 18:41:27 -0400 (0:00:00.057) 0:05:55.115 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed_node2 included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed_node2 TASK [Set test variables] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Saturday 31 August 2024 18:41:27 -0400 (0:00:00.095) 0:05:55.210 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", 
"_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Saturday 31 August 2024 18:41:27 -0400 (0:00:00.045) 0:05:55.256 ******* ok: [managed_node2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Saturday 31 August 2024 18:41:28 -0400 (0:00:00.568) 0:05:55.824 ******* ok: [managed_node2] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Saturday 31 August 2024 18:41:28 -0400 (0:00:00.040) 0:05:55.865 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Saturday 31 August 2024 18:41:28 -0400 (0:00:00.041) 0:05:55.906 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Saturday 31 August 2024 18:41:28 -0400 (0:00:00.042) 0:05:55.949 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Saturday 31 August 2024 18:41:28 -0400 (0:00:00.039) 0:05:55.988 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Saturday 31 August 2024 18:41:28 -0400 (0:00:00.039) 0:05:56.027 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:51 Saturday 31 August 2024 18:41:28 -0400 (0:00:00.035) 0:05:56.063 ******* ok: [managed_node2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:64 Saturday 31 August 2024 18:41:28 -0400 (0:00:00.052) 0:05:56.115 ******* ok: [managed_node2] => { "changed": false, "rc": 0 } STDOUT: False STDERR: Shared connection to 10.31.12.24 closed. 
TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:73 Saturday 31 August 2024 18:41:28 -0400 (0:00:00.277) 0:05:56.393 ******* skipping: [managed_node2] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:83 Saturday 31 August 2024 18:41:28 -0400 (0:00:00.063) 0:05:56.457 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed_node2 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Saturday 31 August 2024 18:41:28 -0400 (0:00:00.116) 0:05:56.573 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Saturday 31 August 2024 18:41:28 -0400 (0:00:00.069) 0:05:56.643 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Saturday 31 August 2024 18:41:28 -0400 (0:00:00.054) 0:05:56.697 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Saturday 31 August 2024 18:41:28 -0400 (0:00:00.082) 0:05:56.779 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Saturday 31 August 2024 18:41:29 -0400 (0:00:00.096) 0:05:56.876 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Saturday 31 August 2024 18:41:29 -0400 (0:00:00.048) 0:05:56.925 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Saturday 31 August 2024 18:41:29 -0400 (0:00:00.054) 0:05:56.979 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: 
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Saturday 31 August 2024 18:41:29 -0400 (0:00:00.057) 0:05:57.037 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Saturday 31 August 2024 18:41:29 -0400 (0:00:00.044) 0:05:57.081 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Saturday 31 August 2024 18:41:29 -0400 (0:00:00.033) 0:05:57.115 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Saturday 31 August 2024 18:41:29 -0400 (0:00:00.037) 0:05:57.153 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:86 Saturday 31 August 2024 18:41:29 -0400 (0:00:00.042) 0:05:57.195 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed_node2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Saturday 31 August 2024 18:41:29 -0400 (0:00:00.153) 0:05:57.348 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed_node2 TASK [Get information about the LV] ******************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Saturday 31 August 2024 18:41:29 -0400 (0:00:00.190) 0:05:57.539 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Saturday 31 August 2024 18:41:29 -0400 (0:00:00.044) 0:05:57.584 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Saturday 31 August 2024 18:41:29 -0400 (0:00:00.039) 0:05:57.623 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** 
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Saturday 31 August 2024 18:41:29 -0400 (0:00:00.039) 0:05:57.663 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Saturday 31 August 2024 18:41:29 -0400 (0:00:00.046) 0:05:57.709 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Saturday 31 August 2024 18:41:29 -0400 (0:00:00.069) 0:05:57.778 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Saturday 31 August 2024 18:41:30 -0400 (0:00:00.066) 0:05:57.845 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:89 Saturday 31 August 2024 18:41:30 -0400 (0:00:00.052) 0:05:57.897 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed_node2 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Saturday 31 August 2024 18:41:30 -0400 (0:00:00.151) 0:05:58.049 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed_node2 TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Saturday 31 August 2024 18:41:30 -0400 (0:00:00.125) 0:05:58.175 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Saturday 31 August 2024 18:41:30 -0400 (0:00:00.049) 0:05:58.224 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Saturday 31 August 2024 18:41:30 -0400 (0:00:00.045) 0:05:58.270 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: 
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Saturday 31 August 2024 18:41:30 -0400 (0:00:00.039) 0:05:58.310 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:92 Saturday 31 August 2024 18:41:30 -0400 (0:00:00.039) 0:05:58.349 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed_node2 TASK [Set test variables] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Saturday 31 August 2024 18:41:30 -0400 (0:00:00.129) 0:05:58.479 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Saturday 31 August 2024 18:41:30 -0400 (0:00:00.100) 0:05:58.579 ******* skipping: [managed_node2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Saturday 31 August 2024 18:41:30 -0400 (0:00:00.067) 0:05:58.647 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed_node2 TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Saturday 31 August 2024 18:41:30 -0400 (0:00:00.129) 0:05:58.777 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Saturday 31 August 2024 18:41:31 -0400 (0:00:00.086) 0:05:58.863 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Saturday 31 August 2024 18:41:31 -0400 (0:00:00.072) 0:05:58.936 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Saturday 31 August 2024 18:41:31 -0400 (0:00:00.052) 0:05:58.988 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result 
was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Saturday 31 August 2024 18:41:31 -0400 (0:00:00.049) 0:05:59.037 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Saturday 31 August 2024 18:41:31 -0400 (0:00:00.053) 0:05:59.090 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Saturday 31 August 2024 18:41:31 -0400 (0:00:00.050) 0:05:59.141 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:95 Saturday 31 August 2024 18:41:31 -0400 (0:00:00.059) 0:05:59.201 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed_node2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Saturday 31 August 2024 18:41:31 -0400 (0:00:00.200) 0:05:59.402 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed_node2 TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Saturday 31 August 2024 18:41:31 -0400 (0:00:00.159) 0:05:59.562 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Saturday 31 August 2024 18:41:31 -0400 (0:00:00.061) 0:05:59.623 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Saturday 31 August 2024 18:41:31 -0400 (0:00:00.054) 0:05:59.677 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Saturday 31 August 2024 18:41:31 -0400 (0:00:00.059) 0:05:59.737 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO 
compression is off] ***************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Saturday 31 August 2024 18:41:32 -0400 (0:00:00.082) 0:05:59.819 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO compression is on] ****************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Saturday 31 August 2024 18:41:32 -0400 (0:00:00.052) 0:05:59.872 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Saturday 31 August 2024 18:41:32 -0400 (0:00:00.068) 0:05:59.940 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:98 Saturday 31 August 2024 18:41:32 -0400 (0:00:00.050) 0:05:59.990 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed_node2 TASK [Run 'stratis report'] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Saturday 31 August 2024 18:41:32 -0400 (0:00:00.137) 0:06:00.128 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:11 Saturday 31 August 2024 18:41:32 -0400 (0:00:00.054) 0:06:00.182 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pool was created] **************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Saturday 31 August 2024 18:41:32 -0400 (0:00:00.063) 0:06:00.245 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:25 Saturday 31 August 2024 18:41:32 -0400 (0:00:00.060) 0:06:00.306 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:34 Saturday 31 August 2024 18:41:32 -0400 (0:00:00.050) 0:06:00.357 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }
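The Stratis branch above is skipped in full because the pool type is LVM. Were this a Stratis pool, the first probe would simply shell out to the Stratis CLI — a sketch only (it assumes stratis-cli is installed on the managed node; the register name mirrors the storage_test_stratis_report fact the suite resets just below):

  - name: Run 'stratis report' (only meaningful for Stratis pools)
    command: stratis report
    register: storage_test_stratis_report
    changed_when: false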
TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:44 Saturday 31 August 2024 18:41:32 -0400 (0:00:00.062) 0:06:00.419 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:101 Saturday 31 August 2024 18:41:32 -0400 (0:00:00.058) 0:06:00.478 ******* ok: [managed_node2] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Saturday 31 August 2024 18:41:32 -0400 (0:00:00.065) 0:06:00.543 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed_node2 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Saturday 31 August 2024 18:41:32 -0400 (0:00:00.120) 0:06:00.663 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Saturday 31 August 2024 18:41:32 -0400 (0:00:00.076) 0:06:00.739 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed_node2 included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed_node2 included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed_node2 included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed_node2 included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed_node2 included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed_node2 included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed_node2 included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed_node2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Saturday 31 August 2024 18:41:33 -0400 (0:00:00.386) 0:06:01.126 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-a4395052-edbc-405e-bf63-64a8c90ae727" }, "changed": false }
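With storage_test_device_path now set to the LUKS mapper device, the mount subset that follows reduces to: exactly one mount-table entry maps that device to /opt/test1. A compact sketch of that assertion over the setup facts (it assumes ansible_mounts has been populated by fact gathering):

  - name: Verify the current mount state by device (sketch of the assertion)
    assert:
      that:
        - >-
          ansible_mounts | selectattr('mount', 'equalto', '/opt/test1')
          | map(attribute='device') | list ==
          ['/dev/mapper/luks-a4395052-edbc-405e-bf63-64a8c90ae727']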
TASK [Set some facts] ********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Saturday 31 August 2024 18:41:33 -0400 (0:00:00.055) 0:06:01.182 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Saturday 31 August 2024 18:41:33 -0400 (0:00:00.062) 0:06:01.244 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Saturday 31 August 2024 18:41:33 -0400 (0:00:00.054) 0:06:01.299 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Saturday 31 August 2024 18:41:33 -0400 (0:00:00.076) 0:06:01.375 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Saturday 31 August 2024 18:41:33 -0400 (0:00:00.051) 0:06:01.426 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Saturday 31 August 2024 18:41:33 -0400 (0:00:00.054) 0:06:01.481 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Saturday 31 August 2024 18:41:33 -0400 (0:00:00.050) 0:06:01.533 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Saturday 31 August 2024 18:41:33 -0400 (0:00:00.054) 0:06:01.583 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Saturday 31 August 2024 18:41:33 -0400 (0:00:00.054) 0:06:01.638 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path:
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Saturday 31 August 2024 18:41:33 -0400 (0:00:00.069) 0:06:01.707 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Saturday 31 August 2024 18:41:33 -0400 (0:00:00.073) 0:06:01.781 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-a4395052-edbc-405e-bf63-64a8c90ae727 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Saturday 31 August 2024 18:41:34 -0400 (0:00:00.178) 0:06:01.959 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Saturday 31 August 2024 18:41:34 -0400 (0:00:00.108) 0:06:02.068 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Saturday 31 August 2024 18:41:34 -0400 (0:00:00.130) 0:06:02.199 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Saturday 31 August 2024 18:41:34 -0400 (0:00:00.051) 0:06:02.251 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:51 Saturday 31 August 2024 18:41:34 -0400 (0:00:00.069) 0:06:02.320 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Saturday 31 August 2024 18:41:34 -0400 (0:00:00.059) 0:06:02.380 ******* ok: [managed_node2] => { "changed": 
false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Saturday 31 August 2024 18:41:34 -0400 (0:00:00.076) 0:06:02.457 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Saturday 31 August 2024 18:41:34 -0400 (0:00:00.076) 0:06:02.533 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1725144080.4156609, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1725144080.4156609, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 226674, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1725144080.4156609, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Saturday 31 August 2024 18:41:35 -0400 (0:00:00.451) 0:06:02.985 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Saturday 31 August 2024 18:41:35 -0400 (0:00:00.075) 0:06:03.061 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Saturday 31 August 2024 18:41:35 -0400 (0:00:00.057) 0:06:03.119 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Saturday 31 August 2024 18:41:35 -0400 (0:00:00.077) 0:06:03.196 ******* ok: [managed_node2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Saturday 31 August 2024 18:41:35 -0400 (0:00:00.128) 0:06:03.325 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Saturday 31 August 2024 18:41:35 -0400 (0:00:00.072) 
0:06:03.398 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Saturday 31 August 2024 18:41:35 -0400 (0:00:00.063) 0:06:03.461 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1725144080.652665, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1725144080.652665, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 226751, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1725144080.652665, "nlink": 1, "path": "/dev/mapper/luks-a4395052-edbc-405e-bf63-64a8c90ae727", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Saturday 31 August 2024 18:41:36 -0400 (0:00:00.518) 0:06:03.980 ******* ok: [managed_node2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Saturday 31 August 2024 18:41:37 -0400 (0:00:01.224) 0:06:05.204 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.028532", "end": "2024-08-31 18:41:37.879288", "rc": 0, "start": "2024-08-31 18:41:37.850756" } STDOUT: LUKS header information for /dev/mapper/foo-test1 Version: 1 Cipher name: aes Cipher mode: xts-plain64 Hash spec: sha256 Payload offset: 8192 MK bits: 512 MK digest: 51 4c 12 ca b1 dc cf 4c a2 0e 73 50 3e b1 04 4c 98 9a 71 68 MK salt: 41 8b c8 4b 78 92 88 58 fd 30 8b 7a 56 09 11 07 af 0f 9a c9 35 7d 04 e8 05 91 bd 4c 65 77 d2 75 MK iterations: 20378 UUID: a4395052-edbc-405e-bf63-64a8c90ae727 Key Slot 0: ENABLED Iterations: 325240 Salt: 86 cf 2d d5 21 5a 24 1c 83 ae f6 bc cd b8 3e 0c 04 95 a8 82 85 eb 6a 3f 82 a1 37 a3 63 eb a9 5e Key material offset: 8 AF stripes: 4000 Key Slot 1: DISABLED Key Slot 2: DISABLED Key Slot 3: DISABLED Key Slot 4: DISABLED Key Slot 5: DISABLED Key Slot 6: DISABLED Key Slot 7: DISABLED TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Saturday 31 August 2024 18:41:37 -0400 (0:00:00.593) 0:06:05.797 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed
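The LUKS header dump above is the raw material for the version, key-size, and cipher assertions that follow (LUKS1 header: Version 1, aes in xts-plain64 mode, 512-bit master key). Condensed into two tasks against a re-run of the same command (luks_dump is an invented register name):

  - name: Collect LUKS info for this volume (same cryptsetup invocation as above)
    command: cryptsetup luksDump /dev/mapper/foo-test1
    register: luks_dump        # hypothetical register name
    changed_when: false

  - name: Check LUKS version, cipher, and key size in one pass
    assert:
      that:
        - luks_dump.stdout is search('Version:\s+1')
        - luks_dump.stdout is search('Cipher name:\s+aes')
        - luks_dump.stdout is search('Cipher mode:\s+xts-plain64')
        - luks_dump.stdout is search('MK bits:\s+512')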
"changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Saturday 31 August 2024 18:41:38 -0400 (0:00:00.087) 0:06:05.975 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Saturday 31 August 2024 18:41:38 -0400 (0:00:00.071) 0:06:06.047 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Saturday 31 August 2024 18:41:38 -0400 (0:00:00.070) 0:06:06.118 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Saturday 31 August 2024 18:41:38 -0400 (0:00:00.125) 0:06:06.244 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Saturday 31 August 2024 18:41:38 -0400 (0:00:00.108) 0:06:06.352 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Set test variables] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Saturday 31 August 2024 18:41:38 -0400 (0:00:00.110) 0:06:06.463 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-a4395052-edbc-405e-bf63-64a8c90ae727 /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Saturday 31 August 2024 18:41:38 -0400 (0:00:00.133) 0:06:06.596 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Saturday 31 August 2024 18:41:38 -0400 (0:00:00.074) 0:06:06.671 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Saturday 31 August 2024 18:41:38 -0400 (0:00:00.072) 0:06:06.744 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: 
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Saturday 31 August 2024 18:41:38 -0400 (0:00:00.068) 0:06:06.813 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Saturday 31 August 2024 18:41:39 -0400 (0:00:00.070) 0:06:06.883 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Saturday 31 August 2024 18:41:39 -0400 (0:00:00.056) 0:06:06.939 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Saturday 31 August 2024 18:41:39 -0400 (0:00:00.100) 0:06:07.040 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Saturday 31 August 2024 18:41:39 -0400 (0:00:00.118) 0:06:07.159 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Saturday 31 August 2024 18:41:39 -0400 (0:00:00.064) 0:06:07.223 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Saturday 31 August 2024 18:41:39 -0400 (0:00:00.075) 0:06:07.299 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Saturday 31 August 2024 18:41:39 -0400 (0:00:00.071) 0:06:07.370 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Saturday 31 August 2024 18:41:39 -0400 (0:00:00.080) 0:06:07.451 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: 
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Saturday 31 August 2024 18:41:39 -0400 (0:00:00.056) 0:06:07.508 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Saturday 31 August 2024 18:41:39 -0400 (0:00:00.054) 0:06:07.563 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Saturday 31 August 2024 18:41:39 -0400 (0:00:00.053) 0:06:07.616 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Saturday 31 August 2024 18:41:39 -0400 (0:00:00.058) 0:06:07.675 ******* ok: [managed_node2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Saturday 31 August 2024 18:41:40 -0400 (0:00:00.999) 0:06:08.675 ******* ok: [managed_node2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Saturday 31 August 2024 18:41:41 -0400 (0:00:00.693) 0:06:09.368 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Saturday 31 August 2024 18:41:41 -0400 (0:00:00.163) 0:06:09.532 ******* ok: [managed_node2] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Saturday 31 August 2024 18:41:41 -0400 (0:00:00.206) 0:06:09.739 ******* ok: [managed_node2] => { "bytes": 10737418240, "changed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Saturday 31 August 2024 18:41:42 -0400 (0:00:00.706) 0:06:10.445 ******* skipping: [managed_node2] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Saturday 31 August 2024 18:41:42 -0400 (0:00:00.094) 0:06:10.540 ******* skipping: [managed_node2] => {}
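The size subset compares binary units: the requested 4 GiB volume is 4 × 1024³ = 4294967296 bytes, and the 10 GiB parent device is 10737418240 bytes, matching the parsed values above. The comparison the remaining tasks build toward can be sketched with the human_to_bytes filter (storage_test_actual_size is the fact displayed under "Show actual size" a few tasks later):

  - name: Establish base value for expected size
    set_fact:
      storage_test_expected_size: "{{ '4G' | human_to_bytes }}"   # 4 * 1024**3 = 4294967296

  - name: Assert expected size is actual size
    assert:
      that:
        - storage_test_expected_size | int == storage_test_actual_size.bytes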
TASK [Show test pool size] ***************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Saturday 31 August 2024 18:41:42 -0400 (0:00:00.133) 0:06:10.674 ******* skipping: [managed_node2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Saturday 31 August 2024 18:41:42 -0400 (0:00:00.128) 0:06:10.803 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Saturday 31 August 2024 18:41:43 -0400 (0:00:00.133) 0:06:10.936 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Saturday 31 August 2024 18:41:43 -0400 (0:00:00.128) 0:06:11.064 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Saturday 31 August 2024 18:41:43 -0400 (0:00:00.117) 0:06:11.182 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Saturday 31 August 2024 18:41:43 -0400 (0:00:00.091) 0:06:11.273 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Saturday 31 August 2024 18:41:43 -0400 (0:00:00.141) 0:06:11.415 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Saturday 31 August 2024 18:41:43 -0400 (0:00:00.127) 0:06:11.543 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Saturday 31 August 2024 18:41:43 -0400 (0:00:00.079) 0:06:11.622 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Saturday 31 August 2024 18:41:43 -0400 (0:00:00.097)
0:06:11.720 ******* skipping: [managed_node2] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Saturday 31 August 2024 18:41:43 -0400 (0:00:00.084) 0:06:11.805 ******* skipping: [managed_node2] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Saturday 31 August 2024 18:41:44 -0400 (0:00:00.054) 0:06:11.860 ******* skipping: [managed_node2] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Saturday 31 August 2024 18:41:44 -0400 (0:00:00.071) 0:06:11.931 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Saturday 31 August 2024 18:41:44 -0400 (0:00:00.106) 0:06:12.038 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Saturday 31 August 2024 18:41:44 -0400 (0:00:00.085) 0:06:12.124 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Saturday 31 August 2024 18:41:44 -0400 (0:00:00.130) 0:06:12.255 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Saturday 31 August 2024 18:41:44 -0400 (0:00:00.090) 0:06:12.345 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Saturday 31 August 2024 18:41:44 -0400 (0:00:00.117) 0:06:12.463 ******* ok: [managed_node2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Saturday 31 August 2024 18:41:44 -0400 (0:00:00.065) 0:06:12.528 ******* ok: [managed_node2] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Saturday 31 
August 2024 18:41:44 -0400 (0:00:00.061) 0:06:12.590 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Saturday 31 August 2024 18:41:44 -0400 (0:00:00.109) 0:06:12.699 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.021924", "end": "2024-08-31 18:41:45.516991", "rc": 0, "start": "2024-08-31 18:41:45.495067" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Saturday 31 August 2024 18:41:45 -0400 (0:00:00.796) 0:06:13.495 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Saturday 31 August 2024 18:41:45 -0400 (0:00:00.084) 0:06:13.580 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Saturday 31 August 2024 18:41:45 -0400 (0:00:00.067) 0:06:13.648 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Saturday 31 August 2024 18:41:45 -0400 (0:00:00.058) 0:06:13.706 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Saturday 31 August 2024 18:41:45 -0400 (0:00:00.051) 0:06:13.758 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Saturday 31 August 2024 18:41:46 -0400 (0:00:00.057) 0:06:13.815 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Saturday 31 August 2024 18:41:46 -0400 (0:00:00.052) 0:06:13.868 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: 
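
The three cache-verification tasks just shown ("Get information about the LV", "Set LV segment type", "Check segment type") illustrate a useful pattern: `lvs` is run with `--noheadings --nameprefixes --units=b --nosuffix --unquoted` so it emits shell-style `LVM2_KEY=VALUE` pairs that are easy to fold into a fact and assert on. Below is a minimal sketch of that pattern, not the test's verbatim source; the regex-based extraction of `LVM2_SEGTYPE` is an assumption made for illustration.

    - name: Get information about the LV
      command: >-
        lvs --noheadings --nameprefixes --units=b --nosuffix --unquoted
        -o name,attr,cache_total_blocks,chunk_size,segtype foo/test1
      register: lvs_out
      changed_when: false  # read-only query, never reports a change

    - name: Extract LVM2_SEGTYPE from the KEY=VALUE output (illustrative parsing)
      set_fact:
        storage_test_lv_segtype: "{{ lvs_out.stdout | regex_search('LVM2_SEGTYPE=(\\S+)', '\\1') }}"

    - name: Check segment type
      assert:
        that: "'linear' in storage_test_lv_segtype"

The `--nameprefixes --unquoted` combination is what makes the stdout seen above ("LVM2_LV_NAME=test1 ... LVM2_SEGTYPE=linear") machine-parseable without column-position guessing.
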
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Saturday 31 August 2024 18:41:46 -0400 (0:00:00.046) 0:06:13.915 ******* TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Saturday 31 August 2024 18:41:46 -0400 (0:00:00.049) 0:06:13.964 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Verify preservation of encryption settings on existing LVM volume] ******* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:377 Saturday 31 August 2024 18:41:46 -0400 (0:00:00.059) 0:06:14.024 ******* TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 31 August 2024 18:41:46 -0400 (0:00:00.127) 0:06:14.151 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 31 August 2024 18:41:46 -0400 (0:00:00.089) 0:06:14.241 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 31 August 2024 18:41:46 -0400 (0:00:00.060) 0:06:14.301 ******* skipping: [managed_node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed_node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed_node2] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed_node2] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 31 August 2024 18:41:46 -0400 (0:00:00.127) 0:06:14.429 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: 
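
The `set_vars.yml` loop above walks candidate vars files from most generic to most specific (RedHat.yml → CentOS.yml → CentOS_7.yml → CentOS_7.9.yml) and loads whichever exist; in this run only CentOS_7.yml matched, supplying the blivet package list. A hedged sketch of that first-found cascade follows — the `role_path`-based paths and the `is file` gate are assumptions for illustration, not the role's verbatim source.

    - name: Set platform/version specific variables
      include_vars: "{{ role_path }}/vars/{{ item }}"
      loop:
        - "{{ ansible_facts['os_family'] }}.yml"
        - "{{ ansible_facts['distribution'] }}.yml"
        - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_major_version'] }}.yml"
        - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_version'] }}.yml"
      when: (role_path ~ '/vars/' ~ item) is file  # skip candidates that don't exist

Ordering matters: later, more specific files (CentOS_7.9.yml) can override variables set by earlier, more generic ones (RedHat.yml).
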
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 31 August 2024 18:41:46 -0400 (0:00:00.053) 0:06:14.483 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 31 August 2024 18:41:46 -0400 (0:00:00.053) 0:06:14.536 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 31 August 2024 18:41:46 -0400 (0:00:00.062) 0:06:14.599 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 31 August 2024 18:41:46 -0400 (0:00:00.056) 0:06:14.655 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 31 August 2024 18:41:46 -0400 (0:00:00.127) 0:06:14.783 ******* ok: [managed_node2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34 TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 31 August 2024 18:41:48 -0400 (0:00:01.315) 0:06:16.099 ******* ok: [managed_node2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 31 August 2024 18:41:48 -0400 (0:00:00.064) 0:06:16.163 ******* ok: [managed_node2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: 
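
The `Show storage_pools` output above is the entire input this play hands the role: the existing pool and volume are restated with no encryption parameters at all, which is precisely how the test exercises preservation of the current LUKS settings. A hedged reconstruction of that invocation, with the structure taken directly from the printed variable (the `include_role` wrapper is an assumption):

    - name: Re-run the storage role against the existing encrypted volume
      include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_pools:
          - name: foo
            type: lvm
            disks: [sda]
            volumes:
              - name: test1
                size: 4g
                mount_point: /opt/test1
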
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 31 August 2024 18:41:48 -0400 (0:00:00.057) 0:06:16.221 ******* ok: [managed_node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Saturday 31 August 2024 18:41:52 -0400 (0:00:04.306) 0:06:20.527 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 31 August 2024 18:41:52 -0400 (0:00:00.059) 0:06:20.587 ******* TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 31 August 2024 18:41:52 -0400 (0:00:00.084) 0:06:20.672 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 31 August 2024 18:41:52 -0400 (0:00:00.034) 0:06:20.707 ******* TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 31 August 2024 18:41:52 -0400 (0:00:00.030) 0:06:20.737 ******* ok: [managed_node2] => { "changed": false, "rc": 0, "results": [ "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } lsrpackages: kpartx lvm2 TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Saturday 31 August 2024 18:41:53 -0400 (0:00:00.689) 0:06:21.426 ******* ok: [managed_node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { 
"name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "mdadm-grow-continue@.service": { "name": 
"mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, 
"plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", 
"state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-cryptsetup@luks\\x2dc2ec575e\\x2d3678\\x2d4666\\x2d8d31\\x2ddf8dc94137b8.service": { "name": "systemd-cryptsetup@luks\\x2dc2ec575e\\x2d3678\\x2d4666\\x2d8d31\\x2ddf8dc94137b8.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { 
"name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Saturday 31 August 2024 18:41:54 -0400 (0:00:01.098) 0:06:22.525 ******* ok: [managed_node2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2dc2ec575e\\x2d3678\\x2d4666\\x2d8d31\\x2ddf8dc94137b8.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Saturday 31 August 2024 18:41:54 -0400 (0:00:00.064) 0:06:22.589 ******* changed: [managed_node2] => (item=systemd-cryptsetup@luks\x2dc2ec575e\x2d3678\x2d4666\x2d8d31\x2ddf8dc94137b8.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dc2ec575e\\x2d3678\\x2d4666\\x2d8d31\\x2ddf8dc94137b8.service", "name": "systemd-cryptsetup@luks\\x2dc2ec575e\\x2d3678\\x2d4666\\x2d8d31\\x2ddf8dc94137b8.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "system-systemd\\x2dcryptsetup.slice cryptsetup-pre.target systemd-journald.socket systemd-readahead-replay.service systemd-readahead-collect.service dev-sda1.device", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", 
"CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "Delegate": "no", "Description": "Cryptography Setup for luks-c2ec575e-3678-4666-8d31-df8dc94137b8", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-c2ec575e-3678-4666-8d31-df8dc94137b8 /dev/sda1 VALUE_SPECIFIED_IN_NO_LOG_PARAMETER ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-c2ec575e-3678-4666-8d31-df8dc94137b8 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2dc2ec575e\\x2d3678\\x2d4666\\x2d8d31\\x2ddf8dc94137b8.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2dc2ec575e\\x2d3678\\x2d4666\\x2d8d31\\x2ddf8dc94137b8.service", "IgnoreOnIsolate": "yes", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2dc2ec575e\\x2d3678\\x2d4666\\x2d8d31\\x2ddf8dc94137b8.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", 
"StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "0", "TimeoutStopUSec": "0", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "bad", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Saturday 31 August 2024 18:41:55 -0400 (0:00:00.586) 0:06:23.176 ******* ok: [managed_node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-a4395052-edbc-405e-bf63-64a8c90ae727" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-a4395052-edbc-405e-bf63-64a8c90ae727", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-a4395052-edbc-405e-bf63-64a8c90ae727", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-a4395052-edbc-405e-bf63-64a8c90ae727", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Saturday 31 August 2024 18:41:59 -0400 (0:00:04.469) 0:06:27.645 ******* skipping: 
[managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Saturday 31 August 2024 18:41:59 -0400 (0:00:00.053) 0:06:27.699 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1725144082.871704, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "39ffbb971a993003597f82eb722c040922dc1098", "ctime": 1725144082.8687038, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263550, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1725144082.8687038, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1311, "uid": 0, "version": "18446744071584693682", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Saturday 31 August 2024 18:42:00 -0400 (0:00:00.396) 0:06:28.095 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Saturday 31 August 2024 18:42:00 -0400 (0:00:00.038) 0:06:28.133 ******* changed: [managed_node2] => (item=systemd-cryptsetup@luks\x2dc2ec575e\x2d3678\x2d4666\x2d8d31\x2ddf8dc94137b8.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dc2ec575e\\x2d3678\\x2d4666\\x2d8d31\\x2ddf8dc94137b8.service", "name": "systemd-cryptsetup@luks\\x2dc2ec575e\\x2d3678\\x2d4666\\x2d8d31\\x2ddf8dc94137b8.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2dc2ec575e\\x2d3678\\x2d4666\\x2d8d31\\x2ddf8dc94137b8.service", "DevicePolicy": "auto", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2dc2ec575e\\x2d3678\\x2d4666\\x2d8d31\\x2ddf8dc94137b8.service", "IgnoreOnIsolate": "no", 
"IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "1048576", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "masked", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2dc2ec575e\\x2d3678\\x2d4666\\x2d8d31\\x2ddf8dc94137b8.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", "UnitFileState": "bad", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Saturday 31 August 2024 18:42:00 -0400 (0:00:00.590) 0:06:28.724 ******* ok: [managed_node2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-a4395052-edbc-405e-bf63-64a8c90ae727" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-a4395052-edbc-405e-bf63-64a8c90ae727", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": 
"foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-a4395052-edbc-405e-bf63-64a8c90ae727", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-a4395052-edbc-405e-bf63-64a8c90ae727", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Saturday 31 August 2024 18:42:00 -0400 (0:00:00.060) 0:06:28.785 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-a4395052-edbc-405e-bf63-64a8c90ae727", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-a4395052-edbc-405e-bf63-64a8c90ae727", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: 
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Saturday 31 August 2024 18:42:01 -0400 (0:00:00.049) 0:06:28.834 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Saturday 31 August 2024 18:42:01 -0400 (0:00:00.039) 0:06:28.873 ******* TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Saturday 31 August 2024 18:42:01 -0400 (0:00:00.033) 0:06:28.906 ******* ok: [managed_node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Saturday 31 August 2024 18:42:01 -0400 (0:00:00.693) 0:06:29.600 ******* ok: [managed_node2] => (item={u'src': u'/dev/mapper/luks-a4395052-edbc-405e-bf63-64a8c90ae727', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-a4395052-edbc-405e-bf63-64a8c90ae727", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-a4395052-edbc-405e-bf63-64a8c90ae727" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Saturday 31 August 2024 18:42:02 -0400 (0:00:00.696) 0:06:30.296 ******* skipping: [managed_node2] => (item={u'src': u'/dev/mapper/luks-a4395052-edbc-405e-bf63-64a8c90ae727', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-a4395052-edbc-405e-bf63-64a8c90ae727", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Saturday 31 August 2024 18:42:02 -0400 (0:00:00.085) 0:06:30.382 ******* ok: [managed_node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Saturday 31 August 2024 18:42:03 -0400 (0:00:00.585) 0:06:30.967 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1725144086.555769, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, 
"blocks": 8, "charset": "us-ascii", "checksum": "74d60b407791a355d6e7435f9b12f5d75e1d0fb0", "ctime": 1725144084.4477317, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 917511, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1725144084.4467318, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "18446744071584696970", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Saturday 31 August 2024 18:42:03 -0400 (0:00:00.464) 0:06:31.432 ******* TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Saturday 31 August 2024 18:42:03 -0400 (0:00:00.092) 0:06:31.525 ******* ok: [managed_node2] TASK [Assert preservation of encryption settings on existing LVM volume] ******* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:391 Saturday 31 August 2024 18:42:04 -0400 (0:00:01.058) 0:06:32.583 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify role results] ***************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:398 Saturday 31 August 2024 18:42:04 -0400 (0:00:00.141) 0:06:32.725 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed_node2 TASK [Print out pool information] ********************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Saturday 31 August 2024 18:42:05 -0400 (0:00:00.104) 0:06:32.830 ******* ok: [managed_node2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-a4395052-edbc-405e-bf63-64a8c90ae727", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-a4395052-edbc-405e-bf63-64a8c90ae727", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, 
"mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Saturday 31 August 2024 18:42:05 -0400 (0:00:00.071) 0:06:32.902 ******* skipping: [managed_node2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Saturday 31 August 2024 18:42:05 -0400 (0:00:00.064) 0:06:32.966 ******* ok: [managed_node2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "a4395052-edbc-405e-bf63-64a8c90ae727" }, "/dev/mapper/luks-a4395052-edbc-405e-bf63-64a8c90ae727": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-a4395052-edbc-405e-bf63-64a8c90ae727", "size": "4G", "type": "crypt", "uuid": "0cdc64a0-04e9-4033-bbf7-f44812397b4d" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "e3t0qk-oSfp-di6k-1NWD-tOt2-EOV5-cOetCf" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Saturday 31 August 2024 18:42:05 -0400 (0:00:00.434) 0:06:33.400 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003260", "end": "2024-08-31 18:42:06.072238", "rc": 0, "start": "2024-08-31 18:42:06.068978" } STDOUT: # system_role:storage # 
TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20
Saturday 31 August 2024 18:42:05 -0400 (0:00:00.434) 0:06:33.400 *******
ok: [managed_node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003260", "end": "2024-08-31 18:42:06.072238", "rc": 0, "start": "2024-08-31 18:42:06.068978" }
STDOUT:
# system_role:storage
#
# /etc/fstab
# Created by anaconda on Thu Jun 20 10:23:46 2024
#
# Accessible filesystems, by reference, are maintained under '/dev/disk'
# See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info
#
UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
/dev/mapper/luks-a4395052-edbc-405e-bf63-64a8c90ae727 /opt/test1 xfs defaults 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25
Saturday 31 August 2024 18:42:06 -0400 (0:00:00.566) 0:06:33.967 *******
ok: [managed_node2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003181", "end": "2024-08-31 18:42:06.552853", "failed_when_result": false, "rc": 0, "start": "2024-08-31 18:42:06.549672" }
STDOUT:
luks-a4395052-edbc-405e-bf63-64a8c90ae727 /dev/mapper/foo-test1 -

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34
Saturday 31 August 2024 18:42:06 -0400 (0:00:00.534) 0:06:34.501 *******
included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed_node2

TASK [Set _storage_pool_tests] *************************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5
Saturday 31 August 2024 18:42:06 -0400 (0:00:00.142) 0:06:34.644 *******
ok: [managed_node2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false }

TASK [Get VG shared value status] **********************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18
Saturday 31 August 2024 18:42:06 -0400 (0:00:00.054) 0:06:34.698 *******
ok: [managed_node2] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.022068", "end": "2024-08-31 18:42:07.219143", "rc": 0, "start": "2024-08-31 18:42:07.197075" }
STDOUT:
0

TASK [Verify that VG shared value checks out] **********************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24
Saturday 31 August 2024 18:42:07 -0400 (0:00:00.424) 0:06:35.123 *******
ok: [managed_node2] => { "changed": false }
MSG: All assertions passed
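The crypttab line read back above has the three standard fields: the mapping name (luks-<LUKS UUID>), the backing device (/dev/mapper/foo-test1), and "-" in the key position, which means the volume is unlocked with a passphrase rather than a key file. The vgs --binary -o shared output of 0 likewise matches the pool's "shared": false setting. A task roughly like the following sketch could perform the same crypttab check (the variable name is illustrative, not the test's):

    - name: Check the crypttab entry for the volume (sketch)
      vars:
        crypttab_entry: "luks-a4395052-edbc-405e-bf63-64a8c90ae727 /dev/mapper/foo-test1 -"
      ansible.builtin.assert:
        that:
          - crypttab_entry.split() | length == 3
          - crypttab_entry.split() | first == "luks-a4395052-edbc-405e-bf63-64a8c90ae727"
          - crypttab_entry.split() | last == "-"   # "-" means passphrase, no key file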
TASK [Verify pool subset] ******************************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34
Saturday 31 August 2024 18:42:07 -0400 (0:00:00.146) 0:06:35.269 *******
included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed_node2
included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed_node2

TASK [Set test variables] ******************************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2
Saturday 31 August 2024 18:42:07 -0400 (0:00:00.127) 0:06:35.397 *******
ok: [managed_node2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false }

TASK [Get the canonical device path for each member device] ********************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8
Saturday 31 August 2024 18:42:07 -0400 (0:00:00.067) 0:06:35.464 *******
ok: [managed_node2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" }

TASK [Set pvs lvm length] ******************************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17
Saturday 31 August 2024 18:42:08 -0400 (0:00:00.414) 0:06:35.879 *******
ok: [managed_node2] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false }

TASK [Set pool pvs] ************************************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22
Saturday 31 August 2024 18:42:08 -0400 (0:00:00.066) 0:06:35.945 *******
ok: [managed_node2] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false }

TASK [Verify PV count] *********************************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27
Saturday 31 August 2024 18:42:08 -0400 (0:00:00.064) 0:06:36.009 *******
ok: [managed_node2] => { "changed": false }
MSG: All assertions passed

TASK [Set expected pv type] ****************************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36
Saturday 31 August 2024 18:42:08 -0400 (0:00:00.073) 0:06:36.082 *******
ok: [managed_node2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }

TASK [Set expected pv type] ****************************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41
Saturday 31 August 2024 18:42:08 -0400 (0:00:00.061) 0:06:36.144 *******
ok: [managed_node2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }

TASK [Set expected pv type] ****************************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46
Saturday 31 August 2024 18:42:08 -0400 (0:00:00.060) 0:06:36.205 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check the type
of each PV] *********************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:51 Saturday 31 August 2024 18:42:08 -0400 (0:00:00.051) 0:06:36.257 ******* ok: [managed_node2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:64 Saturday 31 August 2024 18:42:08 -0400 (0:00:00.074) 0:06:36.332 ******* ok: [managed_node2] => { "changed": false, "rc": 0 } STDOUT: False STDERR: Shared connection to 10.31.12.24 closed. TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:73 Saturday 31 August 2024 18:42:08 -0400 (0:00:00.300) 0:06:36.632 ******* skipping: [managed_node2] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:83 Saturday 31 August 2024 18:42:08 -0400 (0:00:00.062) 0:06:36.695 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed_node2 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Saturday 31 August 2024 18:42:08 -0400 (0:00:00.112) 0:06:36.808 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Saturday 31 August 2024 18:42:09 -0400 (0:00:00.067) 0:06:36.875 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Saturday 31 August 2024 18:42:09 -0400 (0:00:00.065) 0:06:36.940 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Saturday 31 August 2024 18:42:09 -0400 (0:00:00.055) 0:06:36.996 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Saturday 31 August 2024 18:42:09 -0400 (0:00:00.051) 0:06:37.048 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task 
path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Saturday 31 August 2024 18:42:09 -0400 (0:00:00.052) 0:06:37.100 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Saturday 31 August 2024 18:42:09 -0400 (0:00:00.050) 0:06:37.150 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Saturday 31 August 2024 18:42:09 -0400 (0:00:00.060) 0:06:37.210 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Saturday 31 August 2024 18:42:09 -0400 (0:00:00.054) 0:06:37.264 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Saturday 31 August 2024 18:42:09 -0400 (0:00:00.052) 0:06:37.317 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Saturday 31 August 2024 18:42:09 -0400 (0:00:00.048) 0:06:37.365 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:86 Saturday 31 August 2024 18:42:09 -0400 (0:00:00.053) 0:06:37.418 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed_node2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Saturday 31 August 2024 18:42:09 -0400 (0:00:00.122) 0:06:37.541 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed_node2 TASK [Get information about the LV] ******************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Saturday 31 August 2024 18:42:09 -0400 (0:00:00.115) 0:06:37.656 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: 
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Saturday 31 August 2024 18:42:09 -0400 (0:00:00.051) 0:06:37.708 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Saturday 31 August 2024 18:42:09 -0400 (0:00:00.054) 0:06:37.762 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Saturday 31 August 2024 18:42:09 -0400 (0:00:00.050) 0:06:37.812 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Saturday 31 August 2024 18:42:10 -0400 (0:00:00.045) 0:06:37.858 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Saturday 31 August 2024 18:42:10 -0400 (0:00:00.041) 0:06:37.899 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Saturday 31 August 2024 18:42:10 -0400 (0:00:00.052) 0:06:37.952 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:89 Saturday 31 August 2024 18:42:10 -0400 (0:00:00.043) 0:06:37.995 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed_node2 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Saturday 31 August 2024 18:42:10 -0400 (0:00:00.081) 0:06:38.076 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed_node2 TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Saturday 31 August 2024 18:42:10 -0400 (0:00:00.072) 0:06:38.149 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: 
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Saturday 31 August 2024 18:42:10 -0400 (0:00:00.035) 0:06:38.185 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Saturday 31 August 2024 18:42:10 -0400 (0:00:00.033) 0:06:38.218 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Saturday 31 August 2024 18:42:10 -0400 (0:00:00.031) 0:06:38.250 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:92 Saturday 31 August 2024 18:42:10 -0400 (0:00:00.033) 0:06:38.283 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed_node2 TASK [Set test variables] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Saturday 31 August 2024 18:42:10 -0400 (0:00:00.078) 0:06:38.362 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Saturday 31 August 2024 18:42:10 -0400 (0:00:00.038) 0:06:38.401 ******* skipping: [managed_node2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Saturday 31 August 2024 18:42:10 -0400 (0:00:00.038) 0:06:38.440 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed_node2 TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Saturday 31 August 2024 18:42:10 -0400 (0:00:00.064) 0:06:38.505 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Saturday 31 August 2024 18:42:10 -0400 (0:00:00.038) 0:06:38.543 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions 
passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Saturday 31 August 2024 18:42:10 -0400 (0:00:00.093) 0:06:38.636 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Saturday 31 August 2024 18:42:10 -0400 (0:00:00.035) 0:06:38.671 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Saturday 31 August 2024 18:42:10 -0400 (0:00:00.033) 0:06:38.705 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Saturday 31 August 2024 18:42:10 -0400 (0:00:00.033) 0:06:38.738 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Saturday 31 August 2024 18:42:10 -0400 (0:00:00.034) 0:06:38.773 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:95 Saturday 31 August 2024 18:42:10 -0400 (0:00:00.033) 0:06:38.806 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed_node2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Saturday 31 August 2024 18:42:11 -0400 (0:00:00.080) 0:06:38.887 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed_node2 TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Saturday 31 August 2024 18:42:11 -0400 (0:00:00.111) 0:06:38.998 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Saturday 31 August 2024 18:42:11 -0400 (0:00:00.059) 0:06:39.057 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO 
deduplication is on] ****************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21
Saturday 31 August 2024 18:42:11 -0400 (0:00:00.049) 0:06:39.107 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about VDO compression] ***********************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27
Saturday 31 August 2024 18:42:11 -0400 (0:00:00.070) 0:06:39.177 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check if VDO deduplication is off] ***************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34
Saturday 31 August 2024 18:42:11 -0400 (0:00:00.045) 0:06:39.223 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check if VDO deduplication is on] ****************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40
Saturday 31 August 2024 18:42:11 -0400 (0:00:00.045) 0:06:39.268 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46
Saturday 31 August 2024 18:42:11 -0400 (0:00:00.035) 0:06:39.304 *******
ok: [managed_node2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [Check Stratis] ***********************************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:98
Saturday 31 August 2024 18:42:11 -0400 (0:00:00.037) 0:06:39.342 *******
included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed_node2

TASK [Run 'stratis report'] ****************************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6
Saturday 31 August 2024 18:42:11 -0400 (0:00:00.105) 0:06:39.448 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about Stratis] *******************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:11
Saturday 31 August 2024 18:42:11 -0400 (0:00:00.051) 0:06:39.499 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the pool was created] ****************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15
Saturday 31 August 2024 18:42:11 -0400 (0:00:00.043) 0:06:39.543 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that encryption is correctly set] *********************************
task path:
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:25 Saturday 31 August 2024 18:42:11 -0400 (0:00:00.056) 0:06:39.599 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:34 Saturday 31 August 2024 18:42:11 -0400 (0:00:00.055) 0:06:39.655 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:44 Saturday 31 August 2024 18:42:11 -0400 (0:00:00.052) 0:06:39.708 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:101 Saturday 31 August 2024 18:42:11 -0400 (0:00:00.048) 0:06:39.757 ******* ok: [managed_node2] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Saturday 31 August 2024 18:42:11 -0400 (0:00:00.044) 0:06:39.801 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed_node2 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Saturday 31 August 2024 18:42:12 -0400 (0:00:00.073) 0:06:39.874 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Saturday 31 August 2024 18:42:12 -0400 (0:00:00.040) 0:06:39.915 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed_node2 included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed_node2 included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed_node2 included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed_node2 included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed_node2 included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed_node2 included: 
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed_node2
included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed_node2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7
Saturday 31 August 2024 18:42:12 -0400 (0:00:00.274) 0:06:40.190 *******
ok: [managed_node2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-a4395052-edbc-405e-bf63-64a8c90ae727" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11
Saturday 31 August 2024 18:42:12 -0400 (0:00:00.061) 0:06:40.252 *******
ok: [managed_node2] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Get information about the mountpoint directory] **************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19
Saturday 31 August 2024 18:42:12 -0400 (0:00:00.063) 0:06:40.316 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28
Saturday 31 August 2024 18:42:12 -0400 (0:00:00.046) 0:06:40.362 *******
ok: [managed_node2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount directory user] *********************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36
Saturday 31 August 2024 18:42:12 -0400 (0:00:00.056) 0:06:40.419 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify mount directory group] ********************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42
Saturday 31 August 2024 18:42:12 -0400 (0:00:00.055) 0:06:40.474 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify mount directory permissions] **************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48
Saturday 31 August 2024 18:42:12 -0400 (0:00:00.051) 0:06:40.526 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get path of test volume device] ******************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57
Saturday 31 August 2024 18:42:12 -0400 (0:00:00.059) 0:06:40.585 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }
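The mount-state assertion above confirms that /opt/test1 is served by the opened LUKS mapping rather than by the raw LV. Checked directly against gathered facts, an equivalent standalone assertion might read as in this sketch (the filter chain is illustrative, not the test's exact expression):

    - name: Verify /opt/test1 is backed by the LUKS mapping (sketch)
      ansible.builtin.assert:
        that:
          - >-
            ansible_facts.mounts | selectattr('mount', 'equalto', '/opt/test1')
            | map(attribute='device') | first ==
            '/dev/mapper/luks-a4395052-edbc-405e-bf63-64a8c90ae727'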
TASK [Gather swap info] ********************************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63
Saturday 31 August 2024 18:42:12 -0400 (0:00:00.063) 0:06:40.648 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69
Saturday 31 August 2024 18:42:12 -0400 (0:00:00.054) 0:06:40.703 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79
Saturday 31 August 2024 18:42:12 -0400 (0:00:00.043) 0:06:40.747 *******
ok: [managed_node2] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2
Saturday 31 August 2024 18:42:12 -0400 (0:00:00.064) 0:06:40.791 *******
ok: [managed_node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-a4395052-edbc-405e-bf63-64a8c90ae727 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17
Saturday 31 August 2024 18:42:13 -0400 (0:00:00.064) 0:06:40.855 *******
ok: [managed_node2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24
Saturday 31 August 2024 18:42:13 -0400 (0:00:00.042) 0:06:40.898 *******
ok: [managed_node2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33
Saturday 31 August 2024 18:42:13 -0400 (0:00:00.033) 0:06:40.938 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify fingerprint] ******************************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45
Saturday 31 August 2024 18:42:13 -0400 (0:00:00.033) 0:06:40.971 *******
ok: [managed_node2] => { "changed": false }
MSG: All assertions passed

TASK [Clean up variables] ******************************************************
task path:
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:51 Saturday 31 August 2024 18:42:13 -0400 (0:00:00.038) 0:06:41.010 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Saturday 31 August 2024 18:42:13 -0400 (0:00:00.035) 0:06:41.045 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Saturday 31 August 2024 18:42:13 -0400 (0:00:00.060) 0:06:41.105 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Saturday 31 August 2024 18:42:13 -0400 (0:00:00.086) 0:06:41.192 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1725144097.868968, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1725144080.4156609, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 226674, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1725144080.4156609, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Saturday 31 August 2024 18:42:13 -0400 (0:00:00.390) 0:06:41.582 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Saturday 31 August 2024 18:42:13 -0400 (0:00:00.054) 0:06:41.637 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Saturday 31 August 2024 18:42:13 -0400 (0:00:00.140) 0:06:41.778 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: 
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Saturday 31 August 2024 18:42:14 -0400 (0:00:00.052) 0:06:41.830 ******* ok: [managed_node2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Saturday 31 August 2024 18:42:14 -0400 (0:00:00.046) 0:06:41.877 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Saturday 31 August 2024 18:42:14 -0400 (0:00:00.047) 0:06:41.924 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Saturday 31 August 2024 18:42:14 -0400 (0:00:00.047) 0:06:41.972 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1725144080.652665, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1725144080.652665, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 226751, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1725144080.652665, "nlink": 1, "path": "/dev/mapper/luks-a4395052-edbc-405e-bf63-64a8c90ae727", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Saturday 31 August 2024 18:42:14 -0400 (0:00:00.366) 0:06:42.338 ******* ok: [managed_node2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Saturday 31 August 2024 18:42:15 -0400 (0:00:00.711) 0:06:43.050 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.028339", "end": "2024-08-31 18:42:15.568827", "rc": 0, "start": "2024-08-31 18:42:15.540488" } STDOUT: LUKS header information for /dev/mapper/foo-test1 Version: 1 Cipher name: aes Cipher mode: xts-plain64 Hash spec: sha256 Payload offset: 8192 MK bits: 512 MK digest: 51 4c 12 ca b1 dc cf 4c a2 0e 73 50 3e b1 04 4c 98 9a 71 68 MK salt: 41 8b c8 4b 78 92 88 58 fd 30 8b 7a 56 09 11 07 af 0f 9a c9 35 7d 04 e8 05 91 bd 4c 65 77 d2 75 MK iterations: 20378 UUID: a4395052-edbc-405e-bf63-64a8c90ae727 Key Slot 0: ENABLED Iterations: 325240 Salt: 86 cf 2d d5 21 5a 24 1c 83 ae f6 
bc cd b8 3e 0c 04 95 a8 82 85 eb 6a 3f 82 a1 37 a3 63 eb a9 5e
Key material offset: 8
AF stripes: 4000
Key Slot 1: DISABLED
Key Slot 2: DISABLED
Key Slot 3: DISABLED
Key Slot 4: DISABLED
Key Slot 5: DISABLED
Key Slot 6: DISABLED
Key Slot 7: DISABLED

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22
Saturday 31 August 2024 18:42:15 -0400 (0:00:00.460) 0:06:43.511 *******
ok: [managed_node2] => { "changed": false }
MSG: All assertions passed

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29
Saturday 31 August 2024 18:42:15 -0400 (0:00:00.067) 0:06:43.578 *******
ok: [managed_node2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40
Saturday 31 August 2024 18:42:15 -0400 (0:00:00.073) 0:06:43.652 *******
ok: [managed_node2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46
Saturday 31 August 2024 18:42:15 -0400 (0:00:00.073) 0:06:43.725 *******
ok: [managed_node2] => { "changed": false }
MSG: All assertions passed

TASK [Check LUKS version] ******************************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51
Saturday 31 August 2024 18:42:15 -0400 (0:00:00.061) 0:06:43.787 *******
ok: [managed_node2] => { "changed": false }
MSG: All assertions passed

TASK [Check LUKS key size] *****************************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63
Saturday 31 August 2024 18:42:16 -0400 (0:00:00.059) 0:06:43.847 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75
Saturday 31 August 2024 18:42:16 -0400 (0:00:00.039) 0:06:43.886 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set test variables] ******************************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87
Saturday 31 August 2024 18:42:16 -0400 (0:00:00.042) 0:06:43.929 *******
ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-a4395052-edbc-405e-bf63-64a8c90ae727 /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }
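The luksDump output above is consistent with the requested encryption_luks_version of luks1: a version 1 header, aes in xts-plain64 mode with a 512-bit master key, and only key slot 0 enabled. A pair of tasks along these lines could verify the header version independently (a sketch, not the test's exact implementation):

    - name: Read the LUKS header of the backing device (sketch)
      ansible.builtin.command:
        argv: [cryptsetup, luksDump, /dev/mapper/foo-test1]
      register: luks_dump
      changed_when: false

    - name: Assert a LUKS1 header (sketch)
      ansible.builtin.assert:
        that:
          - luks_dump.stdout is search('Version:\s+1')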
TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93
Saturday 31 August 2024 18:42:16 -0400 (0:00:00.068) 0:06:43.998 *******
ok: [managed_node2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100
Saturday 31 August 2024 18:42:16 -0400 (0:00:00.093) 0:06:44.092 *******
ok: [managed_node2] => { "changed": false }
MSG: All assertions passed

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108
Saturday 31 August 2024 18:42:16 -0400 (0:00:00.075) 0:06:44.168 *******
ok: [managed_node2] => { "changed": false }
MSG: All assertions passed

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116
Saturday 31 August 2024 18:42:16 -0400 (0:00:00.134) 0:06:44.303 *******
ok: [managed_node2] => { "changed": false }
MSG: All assertions passed

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124
Saturday 31 August 2024 18:42:16 -0400 (0:00:00.128) 0:06:44.432 *******
ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [Get information about RAID] **********************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8
Saturday 31 August 2024 18:42:16 -0400 (0:00:00.053) 0:06:44.485 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set active devices regex] ************************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14
Saturday 31 August 2024 18:42:16 -0400 (0:00:00.056) 0:06:44.542 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set spare devices regex] *************************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19
Saturday 31 August 2024 18:42:16 -0400 (0:00:00.041) 0:06:44.583 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set md version regex] ****************************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24
Saturday 31 August 2024 18:42:16 -0400 (0:00:00.037) 0:06:44.620 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set chunk size regex] ****************************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29
Saturday 31 August 2024 18:42:16 -0400 (0:00:00.042) 0:06:44.662 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }
[Parse the chunk size] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Saturday 31 August 2024 18:42:16 -0400 (0:00:00.063) 0:06:44.726 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Saturday 31 August 2024 18:42:16 -0400 (0:00:00.051) 0:06:44.778 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Saturday 31 August 2024 18:42:17 -0400 (0:00:00.061) 0:06:44.839 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Saturday 31 August 2024 18:42:17 -0400 (0:00:00.056) 0:06:44.895 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Saturday 31 August 2024 18:42:17 -0400 (0:00:00.066) 0:06:44.961 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Saturday 31 August 2024 18:42:17 -0400 (0:00:00.042) 0:06:45.004 ******* ok: [managed_node2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Saturday 31 August 2024 18:42:17 -0400 (0:00:00.406) 0:06:45.410 ******* ok: [managed_node2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Saturday 31 August 2024 18:42:17 -0400 (0:00:00.384) 0:06:45.794 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Saturday 31 August 2024 18:42:18 -0400 (0:00:00.070) 0:06:45.865 ******* ok: [managed_node2] => { "storage_test_expected_size": "4294967296" }
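Both the actual and the requested size resolve to the same 4294967296 bytes here because the volume was requested as "4g" (4 * 1024^3). If you need the same conversion in a play of your own, Ansible's built-in human_to_bytes filter produces the identical value; a sketch (the fact name is illustrative):

- name: Derive the expected size in bytes from the requested "4g" (sketch)
  set_fact:
    # human_to_bytes treats "4g" as 4 GiB -> 4294967296
    storage_test_expected_size: "{{ '4g' | human_to_bytes }}"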
TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Saturday 31 August 2024 18:42:18 -0400 (0:00:00.070) 0:06:45.935 ******* ok: [managed_node2] => { "bytes": 10737418240, "changed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Saturday 31 August 2024 18:42:18 -0400 (0:00:00.408) 0:06:46.344 ******* skipping: [managed_node2] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Saturday 31 August 2024 18:42:18 -0400 (0:00:00.083) 0:06:46.427 ******* skipping: [managed_node2] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Saturday 31 August 2024 18:42:18 -0400 (0:00:00.092) 0:06:46.520 ******* skipping: [managed_node2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Saturday 31 August 2024 18:42:18 -0400 (0:00:00.065) 0:06:46.585 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Saturday 31 August 2024 18:42:18 -0400 (0:00:00.079) 0:06:46.664 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Saturday 31 August 2024 18:42:18 -0400 (0:00:00.065) 0:06:46.730 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Saturday 31 August 2024 18:42:18 -0400 (0:00:00.065) 0:06:46.796 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Saturday 31 August 2024 18:42:19 -0400 (0:00:00.055) 0:06:46.851 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Saturday 31 August 2024 18:42:19 -0400 (0:00:00.058) 0:06:46.910 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ********************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Saturday 31 August 2024 18:42:19 -0400 (0:00:00.055) 0:06:46.965 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Saturday 31 August 2024 18:42:19 -0400 (0:00:00.042) 0:06:47.008 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Saturday 31 August 2024 18:42:19 -0400 (0:00:00.039) 0:06:47.048 ******* skipping: [managed_node2] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Saturday 31 August 2024 18:42:19 -0400 (0:00:00.037) 0:06:47.086 ******* skipping: [managed_node2] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Saturday 31 August 2024 18:42:19 -0400 (0:00:00.044) 0:06:47.130 ******* skipping: [managed_node2] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Saturday 31 August 2024 18:42:19 -0400 (0:00:00.034) 0:06:47.165 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Saturday 31 August 2024 18:42:19 -0400 (0:00:00.053) 0:06:47.218 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Saturday 31 August 2024 18:42:19 -0400 (0:00:00.088) 0:06:47.307 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Saturday 31 August 2024 18:42:19 -0400 (0:00:00.084) 0:06:47.391 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Saturday 31 August 2024 18:42:19 -0400 (0:00:00.111) 0:06:47.502 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }
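Every thin-pool branch above is skipped because test1 is a plain LVM volume, so the expected size stays at the base value established earlier. The comparison that closes this block reduces to an assert of roughly this shape, using the fact names visible in the log (the exact expression in the test source may differ):

- name: Assert expected size is actual size (sketch)
  assert:
    that:
      # expected size was stored as a string, so cast before comparing
      - storage_test_actual_size.bytes == storage_test_expected_size | int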
TASK [Show actual size] ******************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Saturday 31 August 2024 18:42:19 -0400 (0:00:00.092) 0:06:47.594 ******* ok: [managed_node2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Saturday 31 August 2024 18:42:19 -0400 (0:00:00.093) 0:06:47.688 ******* ok: [managed_node2] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Saturday 31 August 2024 18:42:19 -0400 (0:00:00.067) 0:06:47.755 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Saturday 31 August 2024 18:42:20 -0400 (0:00:00.085) 0:06:47.841 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.021908", "end": "2024-08-31 18:42:20.369969", "rc": 0, "start": "2024-08-31 18:42:20.348061" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Saturday 31 August 2024 18:42:20 -0400 (0:00:00.442) 0:06:48.283 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Saturday 31 August 2024 18:42:20 -0400 (0:00:00.162) 0:06:48.359 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Saturday 31 August 2024 18:42:20 -0400 (0:00:00.085) 0:06:48.522 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Saturday 31 August 2024 18:42:20 -0400 (0:00:00.085) 0:06:48.608 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }
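The "Get information about the LV" call above shows the standard recipe for machine-readable lvs output: --noheadings --nameprefixes --unquoted emits flat KEY=VALUE pairs that are trivial to pick apart. A standalone sketch of reading the segment type the same way (register and fact names are assumptions; regex_search with a capture group returns a list, which matches the ["linear"] fact above):

- name: Read LV attributes for foo/test1 (mirrors the lvs call above)
  command: lvs --noheadings --nameprefixes --units=b --nosuffix --unquoted -o name,attr,segtype foo/test1
  register: lvs_out
  changed_when: false  # read-only query

- name: Extract the segment type (sketch)
  set_fact:
    storage_test_lv_segtype: "{{ lvs_out.stdout | regex_search('LVM2_SEGTYPE=(\\S+)', '\\1') }}"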
=> { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Saturday 31 August 2024 18:42:20 -0400 (0:00:00.059) 0:06:48.766 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Saturday 31 August 2024 18:42:21 -0400 (0:00:00.076) 0:06:48.842 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Saturday 31 August 2024 18:42:21 -0400 (0:00:00.082) 0:06:48.925 ******* TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Saturday 31 August 2024 18:42:21 -0400 (0:00:00.053) 0:06:48.979 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Saturday 31 August 2024 18:42:21 -0400 (0:00:00.054) 0:06:49.033 ******* changed: [managed_node2] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:404 Saturday 31 August 2024 18:42:21 -0400 (0:00:00.592) 0:06:49.626 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed_node2 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Saturday 31 August 2024 18:42:22 -0400 (0:00:00.229) 0:06:49.855 ******* ok: [managed_node2] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Saturday 31 August 2024 18:42:22 -0400 (0:00:00.080) 0:06:49.936 ******* TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 31 August 2024 18:42:22 -0400 (0:00:00.135) 0:06:50.072 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed_node2 TASK [fedora.linux_system_roles.storage : 
TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 31 August 2024 18:42:22 -0400 (0:00:00.143) 0:06:50.215 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 31 August 2024 18:42:22 -0400 (0:00:00.161) 0:06:50.376 ******* skipping: [managed_node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed_node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed_node2] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed_node2] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 31 August 2024 18:42:22 -0400 (0:00:00.192) 0:06:50.569 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 31 August 2024 18:42:22 -0400 (0:00:00.087) 0:06:50.656 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 31 August 2024 18:42:22 -0400 (0:00:00.076) 0:06:50.732 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 31 August 2024 18:42:23 -0400 (0:00:00.096) 0:06:50.829 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }
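The "Set platform/version specific variables" loop above walks from generic to specific vars files (RedHat.yml, CentOS.yml, CentOS_7.yml, CentOS_7.9.yml) and loads each one that exists; on this CentOS 7.9 host only CentOS_7.yml is present, which is where blivet_package_list comes from. A sketch of that pattern with an assumed existence test (the role's real condition lives in set_vars.yml):

- name: Set platform/version specific variables (illustrative sketch)
  include_vars: "{{ role_path }}/vars/{{ item }}"
  loop:
    - RedHat.yml
    - CentOS.yml
    - CentOS_7.yml
    - CentOS_7.9.yml
  # load every matching file, most specific last so it wins
  when: (role_path ~ '/vars/' ~ item) is exists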
TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 31 August 2024 18:42:23 -0400 (0:00:00.070) 0:06:50.900 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 31 August 2024 18:42:23 -0400 (0:00:00.142) 0:06:51.043 ******* ok: [managed_node2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34 TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 31 August 2024 18:42:24 -0400 (0:00:01.509) 0:06:52.552 ******* ok: [managed_node2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 31 August 2024 18:42:24 -0400 (0:00:00.068) 0:06:52.620 ******* ok: [managed_node2] => { "storage_volumes": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 31 August 2024 18:42:24 -0400 (0:00:00.067) 0:06:52.688 ******* ok: [managed_node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Saturday 31 August 2024 18:42:29 -0400 (0:00:04.190) 0:06:56.878 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 31 August 2024 18:42:29 -0400 (0:00:00.085) 0:06:56.964 ******* TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 31 August 2024 18:42:29 -0400 (0:00:00.035) 0:06:56.999 ******* skipping: [managed_node2] => { "changed":
false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 31 August 2024 18:42:29 -0400 (0:00:00.038) 0:06:57.038 ******* TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 31 August 2024 18:42:29 -0400 (0:00:00.042) 0:06:57.081 ******* ok: [managed_node2] => { "changed": false, "rc": 0, "results": [ "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } lsrpackages: kpartx lvm2 TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Saturday 31 August 2024 18:42:30 -0400 (0:00:00.796) 0:06:57.878 ******* ok: [managed_node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": 
"getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": 
"stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { 
"name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { 
"name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-cryptsetup@luks\\x2da4395052\\x2dedbc\\x2d405e\\x2dbf63\\x2d64a8c90ae727.service": { "name": "systemd-cryptsetup@luks\\x2da4395052\\x2dedbc\\x2d405e\\x2dbf63\\x2d64a8c90ae727.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, 
"systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": 
"targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Saturday 31 August 2024 18:42:31 -0400 (0:00:01.145) 0:06:59.024 ******* ok: [managed_node2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2da4395052\\x2dedbc\\x2d405e\\x2dbf63\\x2d64a8c90ae727.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Saturday 31 August 2024 18:42:31 -0400 (0:00:00.087) 0:06:59.111 ******* changed: [managed_node2] => (item=systemd-cryptsetup@luks\x2da4395052\x2dedbc\x2d405e\x2dbf63\x2d64a8c90ae727.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2da4395052\\x2dedbc\\x2d405e\\x2dbf63\\x2d64a8c90ae727.service", "name": "systemd-cryptsetup@luks\\x2da4395052\\x2dedbc\\x2d405e\\x2dbf63\\x2d64a8c90ae727.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "system-systemd\\x2dcryptsetup.slice cryptsetup-pre.target systemd-journald.socket systemd-readahead-replay.service systemd-readahead-collect.service dev-mapper-foo\\x2dtest1.device", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-mapper-foo\\x2dtest1.device", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "Delegate": "no", "Description": "Cryptography Setup for luks-a4395052-edbc-405e-bf63-64a8c90ae727", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-a4395052-edbc-405e-bf63-64a8c90ae727 /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-a4395052-edbc-405e-bf63-64a8c90ae727 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": 
"0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2da4395052\\x2dedbc\\x2d405e\\x2dbf63\\x2d64a8c90ae727.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2da4395052\\x2dedbc\\x2d405e\\x2dbf63\\x2d64a8c90ae727.service", "IgnoreOnIsolate": "yes", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2da4395052\\x2dedbc\\x2d405e\\x2dbf63\\x2d64a8c90ae727.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RequiredBy": "dev-mapper-luks\\x2da4395052\\x2dedbc\\x2d405e\\x2dbf63\\x2d64a8c90ae727.device cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "0", "TimeoutStopUSec": "0", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "bad", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Saturday 31 August 2024 18:42:32 -0400 (0:00:00.761) 0:06:59.872 ******* fatal: [managed_node2]: FAILED! 
=> { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-a4395052-edbc-405e-bf63-64a8c90ae727' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:109 Saturday 31 August 2024 18:42:36 -0400 (0:00:04.700) 0:07:04.572 ******* fatal: [managed_node2]: FAILED! => { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'diskvolume_mkfs_option_map': {u'ext4': u'-F', u'ext3': u'-F', u'ext2': u'-F'}, u'safe_mode': True, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'encryption_tang_url': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_tang_thumbprint': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'encryption': False, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': 0, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': u'luks1', u'raid_stripe_size': None, u'mount_passno': 0, u'mount_user': None, u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'mount_group': None, u'fs_overwrite_existing': True, u'disks': [u'sda'], u'cached': False, u'thin_pool_size': None, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'shared': False, u'encryption_clevis_pin': None, u'type': u'lvm', u'encryption_cipher': None, u'raid_chunk_size': None}], u'volumes': [], u'pool_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_password': None, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'shared': False, u'type': u'lvm', u'encryption_key_size': None, u'raid_chunk_size': None}, u'volume_defaults': {u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 0, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'cache_size': 0, u'raid_spare_count': None, u'cache_mode': None, u'deduplication': None, u'cached': 
False, u'fs_overwrite_existing': True, u'disks': [], u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove existing formatting on device 'luks-a4395052-edbc-405e-bf63-64a8c90ae727' in safe mode due to encryption removal"}
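Note: the failure above is the intended result of this test step. The module_args dump shows the module was invoked with safe_mode: True, and stripping the encryption layer would destroy the existing LUKS formatting on the device, so blivet aborts instead of re-formatting. A playbook that genuinely wants this destructive change opts out of the guard. A minimal sketch, assuming the role's documented storage_safe_mode toggle and the pool layout shown later in this log:

    - hosts: managed_node2
      vars:
        storage_safe_mode: false    # default is true; false permits destructive re-formats
        storage_pools:
          - name: foo
            disks: [sda]
            volumes:
              - name: test1
                size: 4g
                mount_point: /opt/test1
                encryption: false   # turning encryption off forces a re-format of the device
      roles:
        - fedora.linux_system_roles.storage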
"PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RequiredBy": "dev-mapper-luks\\x2da4395052\\x2dedbc\\x2d405e\\x2dbf63\\x2d64a8c90ae727.device cryptsetup.target", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", "UnitFileState": "bad", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Saturday 31 August 2024 18:42:37 -0400 (0:00:00.576) 0:07:05.231 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Saturday 31 August 2024 18:42:37 -0400 (0:00:00.042) 0:07:05.274 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Saturday 31 August 2024 18:42:37 -0400 (0:00:00.051) 0:07:05.325 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Saturday 31 August 2024 18:42:37 -0400 (0:00:00.055) 0:07:05.381 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1725144141.6447392, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1725144141.6447392, "dev": 64769, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 67, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1725144141.6447392, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "407817110", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] 
TASK [Assert file presence] ****************************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16
Saturday 31 August 2024 18:42:38 -0400 (0:00:00.456) 0:07:05.837 *******
ok: [managed_node2] => { "changed": false } MSG: All assertions passed

TASK [Remove the encryption layer] *********************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:427
Saturday 31 August 2024 18:42:38 -0400 (0:00:00.069) 0:07:05.906 *******

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Saturday 31 August 2024 18:42:38 -0400 (0:00:00.135) 0:07:06.041 *******
included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed_node2

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Saturday 31 August 2024 18:42:38 -0400 (0:00:00.063) 0:07:06.105 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Saturday 31 August 2024 18:42:38 -0400 (0:00:00.054) 0:07:06.159 *******
skipping: [managed_node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed_node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed_node2] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" }
skipping: [managed_node2] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Saturday 31 August 2024 18:42:38 -0400 (0:00:00.114) 0:07:06.274 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Saturday 31 August 2024 18:42:38 -0400 (0:00:00.051) 0:07:06.325 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }
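Note: the variable-loading loop above is the usual most-specific-wins pattern: candidate files are tried from generic (RedHat.yml) to specific (CentOS_7.9.yml), every file that actually exists is included, and later includes override earlier keys, so CentOS_7.yml supplies the values here. An illustrative sketch of the pattern, not the role's exact task:

    - name: Set platform/version specific variables
      ansible.builtin.include_vars: "{{ __vars_file }}"
      loop:
        - "{{ ansible_facts['os_family'] }}.yml"
        - "{{ ansible_facts['distribution'] }}.yml"
        - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_major_version'] }}.yml"
        - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_version'] }}.yml"
      vars:
        __vars_file: "{{ role_path }}/vars/{{ item }}"
      when: __vars_file is file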
TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Saturday 31 August 2024 18:42:38 -0400 (0:00:00.059) 0:07:06.384 *******
ok: [managed_node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Saturday 31 August 2024 18:42:38 -0400 (0:00:00.084) 0:07:06.469 *******
ok: [managed_node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Saturday 31 August 2024 18:42:38 -0400 (0:00:00.054) 0:07:06.524 *******
included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed_node2

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
Saturday 31 August 2024 18:42:38 -0400 (0:00:00.100) 0:07:06.625 *******
ok: [managed_node2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] }
lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Saturday 31 August 2024 18:42:40 -0400 (0:00:01.554) 0:07:08.180 *******
ok: [managed_node2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] }

TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14
Saturday 31 August 2024 18:42:40 -0400 (0:00:00.041) 0:07:08.222 *******
ok: [managed_node2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
Saturday 31 August 2024 18:42:40 -0400 (0:00:00.037) 0:07:08.259 *******
ok: [managed_node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] }
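Note: "Get required packages" above is a dry pass: the role calls its bundled blivet module with the packages_only flag (both names taken from the module_args dump earlier in this log) so that nothing changes yet and the module merely reports which packages the real run will need, here lvm2. Those are installed before the destructive pass. A sketch of the two-phase shape under those assumptions:

    - name: Get required packages (dry pass)
      blivet:                       # collection-internal module; name assumed from module_args above
        pools: "{{ storage_pools }}"
        volumes: "{{ storage_volumes | default([]) }}"
        packages_only: true
      register: package_info

    - name: Make sure required packages are installed
      ansible.builtin.package:
        name: "{{ package_info.packages }}"   # ['lvm2'] in this run
        state: present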
TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31
Saturday 31 August 2024 18:42:44 -0400 (0:00:04.461) 0:07:12.721 *******
included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed_node2

TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
Saturday 31 August 2024 18:42:45 -0400 (0:00:00.103) 0:07:12.824 *******

TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13
Saturday 31 August 2024 18:42:45 -0400 (0:00:00.055) 0:07:12.880 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19
Saturday 31 August 2024 18:42:45 -0400 (0:00:00.051) 0:07:12.931 *******

TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37
Saturday 31 August 2024 18:42:45 -0400 (0:00:00.051) 0:07:12.983 *******
ok: [managed_node2] => { "changed": false, "rc": 0, "results": [ "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] }
lsrpackages: kpartx lvm2

TASK [fedora.linux_system_roles.storage : Get service facts] *******************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51
Saturday 31 August 2024 18:42:45 -0400 (0:00:00.773) 0:07:13.757 *******
ok: [managed_node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status":
"static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", 
"source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-cryptsetup@luks\\x2da4395052\\x2dedbc\\x2d405e\\x2dbf63\\x2d64a8c90ae727.service": { "name": "systemd-cryptsetup@luks\\x2da4395052\\x2dedbc\\x2d405e\\x2dbf63\\x2d64a8c90ae727.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": 
"systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Saturday 31 August 2024 18:42:47 -0400 (0:00:01.189) 0:07:14.947 ******* ok: [managed_node2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2da4395052\\x2dedbc\\x2d405e\\x2dbf63\\x2d64a8c90ae727.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Saturday 31 August 2024 18:42:47 -0400 (0:00:00.076) 0:07:15.023 ******* changed: [managed_node2] => (item=systemd-cryptsetup@luks\x2da4395052\x2dedbc\x2d405e\x2dbf63\x2d64a8c90ae727.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2da4395052\\x2dedbc\\x2d405e\\x2dbf63\\x2d64a8c90ae727.service", "name": "systemd-cryptsetup@luks\\x2da4395052\\x2dedbc\\x2d405e\\x2dbf63\\x2d64a8c90ae727.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-readahead-collect.service system-systemd\\x2dcryptsetup.slice cryptsetup-pre.target systemd-journald.socket dev-mapper-foo\\x2dtest1.device systemd-readahead-replay.service", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-mapper-foo\\x2dtest1.device", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", 
"Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "Delegate": "no", "Description": "Cryptography Setup for luks-a4395052-edbc-405e-bf63-64a8c90ae727", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-a4395052-edbc-405e-bf63-64a8c90ae727 /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-a4395052-edbc-405e-bf63-64a8c90ae727 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2da4395052\\x2dedbc\\x2d405e\\x2dbf63\\x2d64a8c90ae727.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2da4395052\\x2dedbc\\x2d405e\\x2dbf63\\x2d64a8c90ae727.service", "IgnoreOnIsolate": "yes", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2da4395052\\x2dedbc\\x2d405e\\x2dbf63\\x2d64a8c90ae727.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RequiredBy": "cryptsetup.target dev-mapper-luks\\x2da4395052\\x2dedbc\\x2d405e\\x2dbf63\\x2d64a8c90ae727.device", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", 
"SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "0", "TimeoutStopUSec": "0", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "bad", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Saturday 31 August 2024 18:42:47 -0400 (0:00:00.655) 0:07:15.679 ******* changed: [managed_node2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-a4395052-edbc-405e-bf63-64a8c90ae727", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-a4395052-edbc-405e-bf63-64a8c90ae727", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-a4395052-edbc-405e-bf63-64a8c90ae727", "password": "-", "state": "absent" } ], "leaves": [ "/dev/mapper/foo-test1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-a4395052-edbc-405e-bf63-64a8c90ae727", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "e2fsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", 
"vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Saturday 31 August 2024 18:43:53 -0400 (0:01:05.172) 0:08:20.852 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Saturday 31 August 2024 18:43:53 -0400 (0:00:00.066) 0:08:20.918 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1725144082.871704, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "39ffbb971a993003597f82eb722c040922dc1098", "ctime": 1725144082.8687038, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263550, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1725144082.8687038, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1311, "uid": 0, "version": "18446744071584693682", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Saturday 31 August 2024 18:43:53 -0400 (0:00:00.605) 0:08:21.524 ******* ok: [managed_node2] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Saturday 31 August 2024 18:43:54 -0400 (0:00:00.647) 0:08:22.172 ******* changed: [managed_node2] => (item=systemd-cryptsetup@luks\x2da4395052\x2dedbc\x2d405e\x2dbf63\x2d64a8c90ae727.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2da4395052\\x2dedbc\\x2d405e\\x2dbf63\\x2d64a8c90ae727.service", "name": "systemd-cryptsetup@luks\\x2da4395052\\x2dedbc\\x2d405e\\x2dbf63\\x2d64a8c90ae727.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2da4395052\\x2dedbc\\x2d405e\\x2dbf63\\x2d64a8c90ae727.service", "DevicePolicy": "auto", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", 
"ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2da4395052\\x2dedbc\\x2d405e\\x2dbf63\\x2d64a8c90ae727.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "1048576", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "masked", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2da4395052\\x2dedbc\\x2d405e\\x2dbf63\\x2d64a8c90ae727.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RequiredBy": "dev-mapper-luks\\x2da4395052\\x2dedbc\\x2d405e\\x2dbf63\\x2d64a8c90ae727.device cryptsetup.target", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", "UnitFileState": "bad", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Saturday 31 August 2024 18:43:55 -0400 (0:00:00.831) 0:08:23.003 ******* ok: [managed_node2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-a4395052-edbc-405e-bf63-64a8c90ae727", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-a4395052-edbc-405e-bf63-64a8c90ae727", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create format", "device": 
"/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-a4395052-edbc-405e-bf63-64a8c90ae727", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/mapper/foo-test1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-a4395052-edbc-405e-bf63-64a8c90ae727", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "e2fsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Saturday 31 August 2024 18:43:55 -0400 (0:00:00.073) 0:08:23.077 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, 
"compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Saturday 31 August 2024 18:43:55 -0400 (0:00:00.105) 0:08:23.182 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Saturday 31 August 2024 18:43:55 -0400 (0:00:00.064) 0:08:23.246 ******* changed: [managed_node2] => (item={u'src': u'/dev/mapper/luks-a4395052-edbc-405e-bf63-64a8c90ae727', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-a4395052-edbc-405e-bf63-64a8c90ae727", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-a4395052-edbc-405e-bf63-64a8c90ae727" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Saturday 31 August 2024 18:43:55 -0400 (0:00:00.438) 0:08:23.684 ******* ok: [managed_node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Saturday 31 August 2024 18:43:56 -0400 (0:00:00.571) 0:08:24.255 ******* changed: [managed_node2] => (item={u'src': u'/dev/mapper/foo-test1', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Saturday 31 
August 2024 18:43:56 -0400 (0:00:00.398) 0:08:24.654 ******* skipping: [managed_node2] => (item={u'src': u'/dev/mapper/foo-test1', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Saturday 31 August 2024 18:43:56 -0400 (0:00:00.070) 0:08:24.725 ******* ok: [managed_node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Saturday 31 August 2024 18:43:57 -0400 (0:00:00.550) 0:08:25.275 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1725144086.555769, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "74d60b407791a355d6e7435f9b12f5d75e1d0fb0", "ctime": 1725144084.4477317, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 917511, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1725144084.4467318, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "18446744071584696970", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Saturday 31 August 2024 18:43:57 -0400 (0:00:00.367) 0:08:25.643 ******* changed: [managed_node2] => (item={u'state': u'absent', u'password': u'-', u'name': u'luks-a4395052-edbc-405e-bf63-64a8c90ae727', u'backing_device': u'/dev/mapper/foo-test1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-a4395052-edbc-405e-bf63-64a8c90ae727", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Saturday 31 August 2024 18:43:58 -0400 (0:00:00.408) 0:08:26.051 ******* ok: [managed_node2] TASK [Verify role results] ***************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:443 Saturday 31 August 2024 18:43:59 -0400 (0:00:00.802) 0:08:26.853 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed_node2 TASK [Print out pool 
information] ********************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Saturday 31 August 2024 18:43:59 -0400 (0:00:00.116) 0:08:26.969 ******* ok: [managed_node2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Saturday 31 August 2024 18:43:59 -0400 (0:00:00.056) 0:08:27.026 ******* skipping: [managed_node2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Saturday 31 August 2024 18:43:59 -0400 (0:00:00.040) 0:08:27.067 ******* ok: [managed_node2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "161567c5-bbf6-423f-9255-b480fce6fb9f" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "e3t0qk-oSfp-di6k-1NWD-tOt2-EOV5-cOetCf" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Saturday 31 August 2024 18:43:59 -0400 (0:00:00.452) 0:08:27.519 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003223", "end": "2024-08-31 18:44:00.202761", "rc": 0, "start": "2024-08-31 18:44:00.199538" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Thu Jun 20 10:23:46 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs 
ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Saturday 31 August 2024 18:44:00 -0400 (0:00:00.677) 0:08:28.197 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003221", "end": "2024-08-31 18:44:00.864487", "failed_when_result": false, "rc": 0, "start": "2024-08-31 18:44:00.861266" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Saturday 31 August 2024 18:44:01 -0400 (0:00:00.652) 0:08:28.849 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed_node2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Saturday 31 August 2024 18:44:01 -0400 (0:00:00.134) 0:08:28.983 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Saturday 31 August 2024 18:44:01 -0400 (0:00:00.056) 0:08:29.039 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.021304", "end": "2024-08-31 18:44:01.585095", "rc": 0, "start": "2024-08-31 18:44:01.563791" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Saturday 31 August 2024 18:44:01 -0400 (0:00:00.479) 0:08:29.518 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Saturday 31 August 2024 18:44:01 -0400 (0:00:00.070) 0:08:29.589 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed_node2 included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed_node2 TASK [Set test variables] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Saturday 31 August 2024 18:44:01 -0400 (0:00:00.167) 0:08:29.757 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Saturday 31 August 2024 18:44:02 -0400 (0:00:00.103) 0:08:29.860 ******* ok: [managed_node2] => (item=/dev/sda) => { 
"ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Saturday 31 August 2024 18:44:02 -0400 (0:00:00.682) 0:08:30.543 ******* ok: [managed_node2] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Saturday 31 August 2024 18:44:02 -0400 (0:00:00.117) 0:08:30.661 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Saturday 31 August 2024 18:44:02 -0400 (0:00:00.094) 0:08:30.756 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Saturday 31 August 2024 18:44:03 -0400 (0:00:00.075) 0:08:30.831 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Saturday 31 August 2024 18:44:03 -0400 (0:00:00.064) 0:08:30.895 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Saturday 31 August 2024 18:44:03 -0400 (0:00:00.068) 0:08:30.964 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:51 Saturday 31 August 2024 18:44:03 -0400 (0:00:00.091) 0:08:31.056 ******* ok: [managed_node2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:64 Saturday 31 August 2024 18:44:03 -0400 (0:00:00.152) 0:08:31.209 ******* ok: [managed_node2] => { "changed": false, "rc": 0 } STDOUT: False STDERR: Shared connection to 10.31.12.24 closed. 
TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:73 Saturday 31 August 2024 18:44:03 -0400 (0:00:00.515) 0:08:31.724 ******* skipping: [managed_node2] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:83 Saturday 31 August 2024 18:44:03 -0400 (0:00:00.084) 0:08:31.808 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed_node2 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Saturday 31 August 2024 18:44:04 -0400 (0:00:00.125) 0:08:31.934 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Saturday 31 August 2024 18:44:04 -0400 (0:00:00.057) 0:08:31.992 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Saturday 31 August 2024 18:44:04 -0400 (0:00:00.076) 0:08:32.068 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Saturday 31 August 2024 18:44:04 -0400 (0:00:00.062) 0:08:32.131 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Saturday 31 August 2024 18:44:04 -0400 (0:00:00.051) 0:08:32.182 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Saturday 31 August 2024 18:44:04 -0400 (0:00:00.052) 0:08:32.234 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Saturday 31 August 2024 18:44:04 -0400 (0:00:00.054) 0:08:32.289 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: 
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Saturday 31 August 2024 18:44:04 -0400 (0:00:00.051) 0:08:32.341 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Saturday 31 August 2024 18:44:04 -0400 (0:00:00.057) 0:08:32.398 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Saturday 31 August 2024 18:44:04 -0400 (0:00:00.055) 0:08:32.454 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Saturday 31 August 2024 18:44:04 -0400 (0:00:00.077) 0:08:32.532 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:86 Saturday 31 August 2024 18:44:04 -0400 (0:00:00.056) 0:08:32.588 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed_node2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Saturday 31 August 2024 18:44:04 -0400 (0:00:00.099) 0:08:32.688 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed_node2 TASK [Get information about the LV] ******************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Saturday 31 August 2024 18:44:04 -0400 (0:00:00.122) 0:08:32.810 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Saturday 31 August 2024 18:44:05 -0400 (0:00:00.038) 0:08:32.849 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Saturday 31 August 2024 18:44:05 -0400 (0:00:00.035) 0:08:32.884 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** 
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Saturday 31 August 2024 18:44:05 -0400 (0:00:00.035) 0:08:32.919 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Saturday 31 August 2024 18:44:05 -0400 (0:00:00.046) 0:08:32.966 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Saturday 31 August 2024 18:44:05 -0400 (0:00:00.056) 0:08:33.023 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Saturday 31 August 2024 18:44:05 -0400 (0:00:00.056) 0:08:33.080 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:89 Saturday 31 August 2024 18:44:05 -0400 (0:00:00.072) 0:08:33.152 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed_node2 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Saturday 31 August 2024 18:44:05 -0400 (0:00:00.166) 0:08:33.318 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed_node2 TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Saturday 31 August 2024 18:44:05 -0400 (0:00:00.116) 0:08:33.435 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Saturday 31 August 2024 18:44:05 -0400 (0:00:00.079) 0:08:33.515 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Saturday 31 August 2024 18:44:05 -0400 (0:00:00.089) 0:08:33.604 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: 
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Saturday 31 August 2024 18:44:05 -0400 (0:00:00.055) 0:08:33.660 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:92 Saturday 31 August 2024 18:44:05 -0400 (0:00:00.060) 0:08:33.720 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed_node2 TASK [Set test variables] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Saturday 31 August 2024 18:44:06 -0400 (0:00:00.156) 0:08:33.877 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Saturday 31 August 2024 18:44:06 -0400 (0:00:00.064) 0:08:33.941 ******* skipping: [managed_node2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Saturday 31 August 2024 18:44:06 -0400 (0:00:00.064) 0:08:34.006 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed_node2 TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Saturday 31 August 2024 18:44:06 -0400 (0:00:00.133) 0:08:34.140 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Saturday 31 August 2024 18:44:06 -0400 (0:00:00.060) 0:08:34.200 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Saturday 31 August 2024 18:44:06 -0400 (0:00:00.085) 0:08:34.286 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Saturday 31 August 2024 18:44:06 -0400 (0:00:00.091) 0:08:34.377 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result 
was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Saturday 31 August 2024 18:44:06 -0400 (0:00:00.080) 0:08:34.457 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Saturday 31 August 2024 18:44:06 -0400 (0:00:00.244) 0:08:34.702 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Saturday 31 August 2024 18:44:06 -0400 (0:00:00.079) 0:08:34.782 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:95 Saturday 31 August 2024 18:44:07 -0400 (0:00:00.072) 0:08:34.855 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed_node2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Saturday 31 August 2024 18:44:07 -0400 (0:00:00.164) 0:08:35.019 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed_node2 TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Saturday 31 August 2024 18:44:07 -0400 (0:00:00.137) 0:08:35.157 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Saturday 31 August 2024 18:44:07 -0400 (0:00:00.052) 0:08:35.210 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Saturday 31 August 2024 18:44:07 -0400 (0:00:00.046) 0:08:35.257 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Saturday 31 August 2024 18:44:07 -0400 (0:00:00.041) 0:08:35.299 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO 
deduplication is off] *************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Saturday 31 August 2024 18:44:07 -0400 (0:00:00.038) 0:08:35.337 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Saturday 31 August 2024 18:44:07 -0400 (0:00:00.035) 0:08:35.373 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Saturday 31 August 2024 18:44:07 -0400 (0:00:00.042) 0:08:35.415 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:98 Saturday 31 August 2024 18:44:07 -0400 (0:00:00.066) 0:08:35.482 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed_node2 TASK [Run 'stratis report'] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Saturday 31 August 2024 18:44:07 -0400 (0:00:00.090) 0:08:35.631 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:11 Saturday 31 August 2024 18:44:07 -0400 (0:00:00.080) 0:08:35.722 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pool was created] **************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Saturday 31 August 2024 18:44:07 -0400 (0:00:00.054) 0:08:35.803 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:25 Saturday 31 August 2024 18:44:08 -0400 (0:00:00.059) 0:08:35.857 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:34 Saturday 31 August 2024 18:44:08 -0400 (0:00:00.059) 0:08:35.917 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: 
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:44 Saturday 31 August 2024 18:44:08 -0400 (0:00:00.044) 0:08:35.962 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:101 Saturday 31 August 2024 18:44:08 -0400 (0:00:00.079) 0:08:36.041 ******* ok: [managed_node2] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Saturday 31 August 2024 18:44:08 -0400 (0:00:00.050) 0:08:36.092 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed_node2 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Saturday 31 August 2024 18:44:08 -0400 (0:00:00.070) 0:08:36.162 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Saturday 31 August 2024 18:44:08 -0400 (0:00:00.049) 0:08:36.211 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed_node2 included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed_node2 included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed_node2 included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed_node2 included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed_node2 included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed_node2 included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed_node2 included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed_node2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Saturday 31 August 2024 18:44:08 -0400 (0:00:00.276) 0:08:36.487 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: 
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Saturday 31 August 2024 18:44:08 -0400 (0:00:00.064) 0:08:36.552 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Saturday 31 August 2024 18:44:08 -0400 (0:00:00.065) 0:08:36.617 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Saturday 31 August 2024 18:44:08 -0400 (0:00:00.062) 0:08:36.680 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Saturday 31 August 2024 18:44:08 -0400 (0:00:00.086) 0:08:36.766 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Saturday 31 August 2024 18:44:09 -0400 (0:00:00.133) 0:08:36.900 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Saturday 31 August 2024 18:44:09 -0400 (0:00:00.094) 0:08:36.994 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Saturday 31 August 2024 18:44:09 -0400 (0:00:00.090) 0:08:37.085 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Saturday 31 August 2024 18:44:09 -0400 (0:00:00.062) 0:08:37.148 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Saturday 31 August 2024 18:44:09 -0400 (0:00:00.061) 0:08:37.209 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Saturday 31 August 2024 
18:44:09 -0400 (0:00:00.068) 0:08:37.278 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Saturday 31 August 2024 18:44:09 -0400 (0:00:00.067) 0:08:37.346 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Saturday 31 August 2024 18:44:09 -0400 (0:00:00.096) 0:08:37.443 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Saturday 31 August 2024 18:44:09 -0400 (0:00:00.062) 0:08:37.505 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Saturday 31 August 2024 18:44:09 -0400 (0:00:00.060) 0:08:37.566 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Saturday 31 August 2024 18:44:09 -0400 (0:00:00.060) 0:08:37.626 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:51 Saturday 31 August 2024 18:44:09 -0400 (0:00:00.059) 0:08:37.686 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Saturday 31 August 2024 18:44:09 -0400 (0:00:00.078) 0:08:37.764 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: 
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Saturday 31 August 2024 18:44:10 -0400 (0:00:00.094) 0:08:37.859 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Saturday 31 August 2024 18:44:10 -0400 (0:00:00.080) 0:08:37.940 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1725144232.8953438, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1725144232.8953438, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 247755, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1725144232.8953438, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Saturday 31 August 2024 18:44:10 -0400 (0:00:00.434) 0:08:38.374 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Saturday 31 August 2024 18:44:10 -0400 (0:00:00.073) 0:08:38.447 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Saturday 31 August 2024 18:44:10 -0400 (0:00:00.059) 0:08:38.506 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Saturday 31 August 2024 18:44:10 -0400 (0:00:00.066) 0:08:38.573 ******* ok: [managed_node2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Saturday 31 August 2024 18:44:10 -0400 (0:00:00.064) 0:08:38.638 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Saturday 31 August 2024 18:44:10 -0400 (0:00:00.065) 0:08:38.703 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] 
************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Saturday 31 August 2024 18:44:10 -0400 (0:00:00.067) 0:08:38.771 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Saturday 31 August 2024 18:44:11 -0400 (0:00:00.067) 0:08:38.839 ******* ok: [managed_node2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Saturday 31 August 2024 18:44:11 -0400 (0:00:00.936) 0:08:39.776 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Saturday 31 August 2024 18:44:12 -0400 (0:00:00.056) 0:08:39.832 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Saturday 31 August 2024 18:44:12 -0400 (0:00:00.057) 0:08:39.890 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Saturday 31 August 2024 18:44:12 -0400 (0:00:00.060) 0:08:39.950 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Saturday 31 August 2024 18:44:12 -0400 (0:00:00.043) 0:08:39.993 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Saturday 31 August 2024 18:44:12 -0400 (0:00:00.044) 0:08:40.037 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Saturday 31 August 2024 18:44:12 -0400 (0:00:00.042) 0:08:40.080 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: 
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Saturday 31 August 2024 18:44:12 -0400 (0:00:00.033) 0:08:40.114 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Saturday 31 August 2024 18:44:12 -0400 (0:00:00.042) 0:08:40.156 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Saturday 31 August 2024 18:44:12 -0400 (0:00:00.057) 0:08:40.213 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Saturday 31 August 2024 18:44:12 -0400 (0:00:00.090) 0:08:40.304 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Saturday 31 August 2024 18:44:12 -0400 (0:00:00.052) 0:08:40.357 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Saturday 31 August 2024 18:44:12 -0400 (0:00:00.059) 0:08:40.416 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Saturday 31 August 2024 18:44:12 -0400 (0:00:00.054) 0:08:40.471 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Saturday 31 August 2024 18:44:12 -0400 (0:00:00.059) 0:08:40.530 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Saturday 31 August 2024 18:44:12 -0400 (0:00:00.053) 0:08:40.584 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] 
*************************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19
Saturday 31 August 2024 18:44:12 -0400 (0:00:00.040) 0:08:40.625 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set md version regex] ****************************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24
Saturday 31 August 2024 18:44:12 -0400 (0:00:00.041) 0:08:40.666 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set chunk size regex] ****************************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29
Saturday 31 August 2024 18:44:12 -0400 (0:00:00.040) 0:08:40.707 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the chunk size] ****************************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37
Saturday 31 August 2024 18:44:12 -0400 (0:00:00.038) 0:08:40.745 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID active devices count] *****************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46
Saturday 31 August 2024 18:44:12 -0400 (0:00:00.042) 0:08:40.788 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID spare devices count] ******************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54
Saturday 31 August 2024 18:44:13 -0400 (0:00:00.033) 0:08:40.822 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID metadata version] *********************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62
Saturday 31 August 2024 18:44:13 -0400 (0:00:00.033) 0:08:40.855 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID chunk size] ***************************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70
Saturday 31 August 2024 18:44:13 -0400 (0:00:00.033) 0:08:40.888 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the actual size of the volume] *************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3
Saturday 31 August 2024 18:44:13 -0400 (0:00:00.034) 0:08:40.923 *******
ok: [managed_node2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" }
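The two "Parse ... size" tasks normalize sizes to a plain byte count before comparison: the requested "4g" and the actual LV size both resolve to 4294967296 bytes (4 x 1024^3). A rough equivalent using Ansible's built-in size filters, as a sketch only (the role's internal parsing may differ, and the exact human_readable output format is approximate):

- name: Normalize a human-readable size to bytes (sketch)
  debug:
    msg: "{{ '4G' | human_to_bytes }}"  # -> 4294967296

- name: Render a byte count back in human-readable form (sketch)
  debug:
    msg: "{{ 4294967296 | human_readable }}"  # -> roughly "4.00 GB"

TASK [Parse the requested size of the volume] **********************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11
Saturday 31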
August 2024 18:44:13 -0400 (0:00:00.402) 0:08:41.326 ******* ok: [managed_node2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Saturday 31 August 2024 18:44:14 -0400 (0:00:00.493) 0:08:41.819 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Saturday 31 August 2024 18:44:14 -0400 (0:00:00.070) 0:08:41.890 ******* ok: [managed_node2] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Saturday 31 August 2024 18:44:14 -0400 (0:00:00.058) 0:08:41.948 ******* ok: [managed_node2] => { "bytes": 10737418240, "changed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Saturday 31 August 2024 18:44:14 -0400 (0:00:00.378) 0:08:42.327 ******* skipping: [managed_node2] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Saturday 31 August 2024 18:44:14 -0400 (0:00:00.053) 0:08:42.381 ******* skipping: [managed_node2] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Saturday 31 August 2024 18:44:14 -0400 (0:00:00.078) 0:08:42.459 ******* skipping: [managed_node2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Saturday 31 August 2024 18:44:14 -0400 (0:00:00.058) 0:08:42.517 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Saturday 31 August 2024 18:44:14 -0400 (0:00:00.058) 0:08:42.576 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Saturday 31 August 2024 18:44:14 -0400 (0:00:00.053) 0:08:42.630 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 
Saturday 31 August 2024 18:44:14 -0400 (0:00:00.057) 0:08:42.687 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Saturday 31 August 2024 18:44:14 -0400 (0:00:00.059) 0:08:42.746 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Saturday 31 August 2024 18:44:14 -0400 (0:00:00.057) 0:08:42.804 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Saturday 31 August 2024 18:44:15 -0400 (0:00:00.049) 0:08:42.854 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Saturday 31 August 2024 18:44:15 -0400 (0:00:00.049) 0:08:42.903 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Saturday 31 August 2024 18:44:15 -0400 (0:00:00.051) 0:08:42.955 ******* skipping: [managed_node2] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Saturday 31 August 2024 18:44:15 -0400 (0:00:00.054) 0:08:43.009 ******* skipping: [managed_node2] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Saturday 31 August 2024 18:44:15 -0400 (0:00:00.050) 0:08:43.060 ******* skipping: [managed_node2] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Saturday 31 August 2024 18:44:15 -0400 (0:00:00.050) 0:08:43.111 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Saturday 31 August 2024 18:44:15 -0400 (0:00:00.053) 0:08:43.165 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 
Saturday 31 August 2024 18:44:15 -0400 (0:00:00.053) 0:08:43.218 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Calculate the expected thin pool volume size based on percentage value] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131
Saturday 31 August 2024 18:44:15 -0400 (0:00:00.053) 0:08:43.272 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Replace expected volume size with calculated value] **********************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137
Saturday 31 August 2024 18:44:15 -0400 (0:00:00.058) 0:08:43.330 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Show actual size] ********************************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143
Saturday 31 August 2024 18:44:15 -0400 (0:00:00.052) 0:08:43.383 *******
ok: [managed_node2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } }

TASK [Show expected size] ******************************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147
Saturday 31 August 2024 18:44:15 -0400 (0:00:00.058) 0:08:43.442 *******
ok: [managed_node2] => { "storage_test_expected_size": "4294967296" }

TASK [Assert expected size is actual size] *************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151
Saturday 31 August 2024 18:44:15 -0400 (0:00:00.084) 0:08:43.526 *******
ok: [managed_node2] => { "changed": false }
MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5
Saturday 31 August 2024 18:44:15 -0400 (0:00:00.061) 0:08:43.588 *******
ok: [managed_node2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.021092", "end": "2024-08-31 18:44:16.079650", "rc": 0, "start": "2024-08-31 18:44:16.058558" }
STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [Set LV segment type] *****************************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13
Saturday 31 August 2024 18:44:16 -0400 (0:00:00.372) 0:08:43.960 *******
ok: [managed_node2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false }

TASK [Check segment type] ******************************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17
Saturday 31 August 2024 18:44:16 -0400 (0:00:00.047) 0:08:44.008 *******
ok: [managed_node2] => { "changed": false }
MSG: All assertions passed
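The cache verification above shells out to lvs with --nameprefixes so that stdout comes back as easy-to-parse KEY=value pairs (LVM2_SEGTYPE=linear, and so on). The same query, reduced to a standalone task (the command is exactly the one logged above; the register name is hypothetical):

- name: Query the LV's segment type and cache fields via lvs
  command: >
    lvs --noheadings --nameprefixes --units=b --nosuffix --unquoted
    -o name,attr,cache_total_blocks,chunk_size,segtype foo/test1
  register: lvs_info   # hypothetical register name
  changed_when: false  # read-only query

TASK [Set LV cache size]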
*******************************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24
Saturday 31 August 2024 18:44:16 -0400 (0:00:00.048) 0:08:44.056 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the requested cache size] ******************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31
Saturday 31 August 2024 18:44:16 -0400 (0:00:00.042) 0:08:44.099 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set expected cache size] *************************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37
Saturday 31 August 2024 18:44:16 -0400 (0:00:00.108) 0:08:44.207 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42
Saturday 31 August 2024 18:44:16 -0400 (0:00:00.041) 0:08:44.248 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25
Saturday 31 August 2024 18:44:16 -0400 (0:00:00.038) 0:08:44.287 *******
ok: [managed_node2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44
Saturday 31 August 2024 18:44:16 -0400 (0:00:00.034) 0:08:44.322 *******

TASK [Clean up variable namespace] *********************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54
Saturday 31 August 2024 18:44:16 -0400 (0:00:00.031) 0:08:44.354 *******
ok: [managed_node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false }

TASK [Create a file] ***********************************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12
Saturday 31 August 2024 18:44:16 -0400 (0:00:00.035) 0:08:44.389 *******
changed: [managed_node2] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 }

TASK [Test for correct handling of safe_mode] **********************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:449
Saturday 31 August 2024 18:44:16 -0400 (0:00:00.344) 0:08:44.734 *******
included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed_node2
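The safe_mode test that begins here follows a fixed pattern: write a marker file onto the mounted volume, re-run the role with a change that would destroy the existing filesystem, and verify both that the role fails and that the marker file survives. The marker-file step is essentially a touch, e.g. (a sketch; the actual create-test-file.yml may differ in detail):

- name: Create a file
  file:
    path: /opt/test1/quux
    state: touch
    mode: "0644"

TASK [Store global variable value copy] ****************************************
task path: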
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Saturday 31 August 2024 18:44:16 -0400 (0:00:00.075) 0:08:44.809 ******* ok: [managed_node2] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Saturday 31 August 2024 18:44:17 -0400 (0:00:00.042) 0:08:44.852 ******* TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 31 August 2024 18:44:17 -0400 (0:00:00.052) 0:08:44.904 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 31 August 2024 18:44:17 -0400 (0:00:00.057) 0:08:44.962 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 31 August 2024 18:44:17 -0400 (0:00:00.043) 0:08:45.006 ******* skipping: [managed_node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed_node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed_node2] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed_node2] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 31 August 2024 18:44:17 -0400 (0:00:00.086) 0:08:45.093 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 31 August 2024 18:44:17 -0400 (0:00:00.033) 0:08:45.126 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an 
empty list of pools to be used in testing] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Saturday 31 August 2024 18:44:17 -0400 (0:00:00.036) 0:08:45.162 *******
ok: [managed_node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Saturday 31 August 2024 18:44:17 -0400 (0:00:00.035) 0:08:45.198 *******
ok: [managed_node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Saturday 31 August 2024 18:44:17 -0400 (0:00:00.034) 0:08:45.233 *******
included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed_node2

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
Saturday 31 August 2024 18:44:17 -0400 (0:00:00.082) 0:08:45.315 *******
ok: [managed_node2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] }
lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Saturday 31 August 2024 18:44:18 -0400 (0:00:01.219) 0:08:46.534 *******
ok: [managed_node2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] }

TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14
Saturday 31 August 2024 18:44:18 -0400 (0:00:00.045) 0:08:46.580 *******
ok: [managed_node2] => { "storage_volumes": [] }

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
Saturday 31 August 2024 18:44:18 -0400 (0:00:00.054) 0:08:46.634 *******
ok: [managed_node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] }
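The storage_pools value echoed above is the crux of this test case: it asks the role to add LUKS encryption (encryption: true) to the existing test1 volume, which already carries an xfs filesystem. Reconstructed as play variables, the request looks roughly like this (indentation illustrative; values taken from the Show storage_pools output):

storage_pools:
  - name: foo
    type: lvm
    disks: [sda]
    volumes:
      - name: test1
        size: 4g
        mount_point: /opt/test1
        encryption: true
        encryption_password: yabbadabbadoo  # throwaway test password from the log

TASK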
[fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Saturday 31 August 2024 18:44:22 -0400 (0:00:04.128) 0:08:50.763 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 31 August 2024 18:44:23 -0400 (0:00:00.087) 0:08:50.851 ******* TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 31 August 2024 18:44:23 -0400 (0:00:00.036) 0:08:50.888 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 31 August 2024 18:44:23 -0400 (0:00:00.034) 0:08:50.922 ******* TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 31 August 2024 18:44:23 -0400 (0:00:00.030) 0:08:50.952 ******* ok: [managed_node2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } lsrpackages: cryptsetup kpartx lvm2 TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Saturday 31 August 2024 18:44:23 -0400 (0:00:00.664) 0:08:51.617 ******* ok: [managed_node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": 
{ "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": 
"dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { 
"name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, 
"plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-cryptsetup@luks\\x2da4395052\\x2dedbc\\x2d405e\\x2dbf63\\x2d64a8c90ae727.service": { "name": "systemd-cryptsetup@luks\\x2da4395052\\x2dedbc\\x2d405e\\x2dbf63\\x2d64a8c90ae727.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": 
"systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Saturday 31 August 2024 18:44:24 -0400 (0:00:01.147) 0:08:52.765 ******* ok: [managed_node2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2da4395052\\x2dedbc\\x2d405e\\x2dbf63\\x2d64a8c90ae727.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Saturday 31 August 2024 18:44:25 -0400 (0:00:00.101) 0:08:52.866 ******* changed: [managed_node2] => (item=systemd-cryptsetup@luks\x2da4395052\x2dedbc\x2d405e\x2dbf63\x2d64a8c90ae727.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2da4395052\\x2dedbc\\x2d405e\\x2dbf63\\x2d64a8c90ae727.service", "name": "systemd-cryptsetup@luks\\x2da4395052\\x2dedbc\\x2d405e\\x2dbf63\\x2d64a8c90ae727.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-readahead-collect.service cryptsetup-pre.target dev-mapper-foo\\x2dtest1.device systemd-journald.socket systemd-readahead-replay.service system-systemd\\x2dcryptsetup.slice", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-mapper-foo\\x2dtest1.device", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", 
"CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "Delegate": "no", "Description": "Cryptography Setup for luks-a4395052-edbc-405e-bf63-64a8c90ae727", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-a4395052-edbc-405e-bf63-64a8c90ae727 /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-a4395052-edbc-405e-bf63-64a8c90ae727 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2da4395052\\x2dedbc\\x2d405e\\x2dbf63\\x2d64a8c90ae727.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2da4395052\\x2dedbc\\x2d405e\\x2dbf63\\x2d64a8c90ae727.service", "IgnoreOnIsolate": "yes", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2da4395052\\x2dedbc\\x2d405e\\x2dbf63\\x2d64a8c90ae727.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", 
"SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "0", "TimeoutStopUSec": "0", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "bad", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Saturday 31 August 2024 18:44:25 -0400 (0:00:00.682) 0:08:53.549 ******* fatal: [managed_node2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'test1' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:109 Saturday 31 August 2024 18:44:30 -0400 (0:00:04.453) 0:08:58.002 ******* fatal: [managed_node2]: FAILED! => { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'diskvolume_mkfs_option_map': {u'ext4': u'-F', u'ext3': u'-F', u'ext2': u'-F'}, u'safe_mode': True, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'encryption_tang_url': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_tang_thumbprint': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'encryption': True, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'mount_passno': 0, u'mount_user': None, u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'mount_group': None, u'fs_overwrite_existing': True, u'disks': [u'sda'], u'cached': False, u'thin_pool_size': None, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'shared': False, u'encryption_clevis_pin': None, u'type': u'lvm', u'encryption_cipher': None, u'raid_chunk_size': None}], u'volumes': [], u'pool_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_password': None, u'encryption': False, u'disks': [], 
u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'shared': False, u'type': u'lvm', u'encryption_key_size': None, u'raid_chunk_size': None}, u'volume_defaults': {u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 0, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'cache_size': 0, u'raid_spare_count': None, u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove existing formatting on device 'test1' in safe mode due to adding encryption"} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Saturday 31 August 2024 18:44:30 -0400 (0:00:00.058) 0:08:58.061 ******* changed: [managed_node2] => (item=systemd-cryptsetup@luks\x2da4395052\x2dedbc\x2d405e\x2dbf63\x2d64a8c90ae727.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2da4395052\\x2dedbc\\x2d405e\\x2dbf63\\x2d64a8c90ae727.service", "name": "systemd-cryptsetup@luks\\x2da4395052\\x2dedbc\\x2d405e\\x2dbf63\\x2d64a8c90ae727.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2da4395052\\x2dedbc\\x2d405e\\x2dbf63\\x2d64a8c90ae727.service", "DevicePolicy": "auto", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2da4395052\\x2dedbc\\x2d405e\\x2dbf63\\x2d64a8c90ae727.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": 
"18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "1048576", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "masked", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2da4395052\\x2dedbc\\x2d405e\\x2dbf63\\x2d64a8c90ae727.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", "UnitFileState": "bad", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Saturday 31 August 2024 18:44:30 -0400 (0:00:00.545) 0:08:58.607 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Saturday 31 August 2024 18:44:30 -0400 (0:00:00.048) 0:08:58.656 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Saturday 31 August 2024 18:44:30 -0400 (0:00:00.054) 0:08:58.711 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Saturday 31 August 2024 18:44:30 -0400 (0:00:00.038) 0:08:58.750 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1725144256.854766, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1725144256.854766, "dev": 64768, 
"device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 67, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1725144256.854766, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "18446744073016935354", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Saturday 31 August 2024 18:44:31 -0400 (0:00:00.387) 0:08:59.137 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Add encryption to the volume] ******************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:472 Saturday 31 August 2024 18:44:31 -0400 (0:00:00.080) 0:08:59.218 ******* TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 31 August 2024 18:44:31 -0400 (0:00:00.286) 0:08:59.505 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 31 August 2024 18:44:31 -0400 (0:00:00.096) 0:08:59.601 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 31 August 2024 18:44:31 -0400 (0:00:00.066) 0:08:59.668 ******* skipping: [managed_node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed_node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed_node2] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed_node2] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: 
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 31 August 2024 18:44:31 -0400 (0:00:00.136) 0:08:59.805 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 31 August 2024 18:44:32 -0400 (0:00:00.053) 0:08:59.858 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 31 August 2024 18:44:32 -0400 (0:00:00.051) 0:08:59.909 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 31 August 2024 18:44:32 -0400 (0:00:00.063) 0:08:59.973 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 31 August 2024 18:44:32 -0400 (0:00:00.097) 0:09:00.070 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 31 August 2024 18:44:32 -0400 (0:00:00.177) 0:09:00.247 ******* ok: [managed_node2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34 TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 31 August 2024 18:44:33 -0400 (0:00:01.450) 0:09:01.698 ******* ok: [managed_node2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: 
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 31 August 2024 18:44:33 -0400 (0:00:00.065) 0:09:01.764 ******* ok: [managed_node2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 31 August 2024 18:44:34 -0400 (0:00:00.074) 0:09:01.838 ******* ok: [managed_node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Saturday 31 August 2024 18:44:38 -0400 (0:00:04.295) 0:09:06.134 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 31 August 2024 18:44:38 -0400 (0:00:00.131) 0:09:06.266 ******* TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 31 August 2024 18:44:38 -0400 (0:00:00.066) 0:09:06.332 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 31 August 2024 18:44:38 -0400 (0:00:00.057) 0:09:06.389 ******* TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 31 August 2024 18:44:38 -0400 (0:00:00.053) 0:09:06.443 ******* ok: [managed_node2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } lsrpackages: cryptsetup kpartx lvm2 TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Saturday 31 August 2024 18:44:39 -0400 (0:00:00.857) 0:09:07.300 ******* ok: [managed_node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
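
Before changing anything, the role calls the blivet module once in package-discovery mode (the packages_only flag seen in the earlier module args) so it can install whatever the requested state needs; here the encrypted LVM spec resolves to cryptsetup and lvm2, both already present. The install step is then essentially the following (a sketch, with blivet_output standing in for the registered result):

    - name: Make sure required packages are installed (sketch)
      package:
        name: "{{ blivet_output.packages }}"
        state: present
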
"auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": 
"plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": 
"rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { 
"name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Saturday 31 August 2024 18:44:40 -0400 (0:00:01.301) 0:09:08.602 ******* ok: [managed_node2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Saturday 31 August 2024 18:44:40 -0400 (0:00:00.082) 0:09:08.684 ******* TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Saturday 31 August 2024 18:44:40 -0400 (0:00:00.090) 0:09:08.775 ******* changed: [managed_node2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-ee815e44-f265-47ad-8a0c-f49650ddf8a5", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-ee815e44-f265-47ad-8a0c-f49650ddf8a5", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-ee815e44-f265-47ad-8a0c-f49650ddf8a5", "password": 
"-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-ee815e44-f265-47ad-8a0c-f49650ddf8a5" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-ee815e44-f265-47ad-8a0c-f49650ddf8a5", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-ee815e44-f265-47ad-8a0c-f49650ddf8a5", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-ee815e44-f265-47ad-8a0c-f49650ddf8a5", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Saturday 31 August 2024 18:44:52 -0400 (0:00:11.168) 0:09:19.944 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Saturday 31 August 2024 18:44:52 -0400 (0:00:00.035) 0:09:19.980 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1725144236.7374115, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "3fceedeef6c619b69ada96279531b69ed89734ba", "ctime": 1725144236.7344115, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263550, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", 
"mtime": 1725144236.7344115, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1279, "uid": 0, "version": "18446744071584693682", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Saturday 31 August 2024 18:44:52 -0400 (0:00:00.354) 0:09:20.335 ******* ok: [managed_node2] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Saturday 31 August 2024 18:44:52 -0400 (0:00:00.338) 0:09:20.673 ******* TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Saturday 31 August 2024 18:44:52 -0400 (0:00:00.031) 0:09:20.705 ******* ok: [managed_node2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-ee815e44-f265-47ad-8a0c-f49650ddf8a5", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-ee815e44-f265-47ad-8a0c-f49650ddf8a5", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-ee815e44-f265-47ad-8a0c-f49650ddf8a5", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-ee815e44-f265-47ad-8a0c-f49650ddf8a5" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-ee815e44-f265-47ad-8a0c-f49650ddf8a5", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-ee815e44-f265-47ad-8a0c-f49650ddf8a5", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-ee815e44-f265-47ad-8a0c-f49650ddf8a5", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", 
"fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Saturday 31 August 2024 18:44:52 -0400 (0:00:00.051) 0:09:20.756 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-ee815e44-f265-47ad-8a0c-f49650ddf8a5", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-ee815e44-f265-47ad-8a0c-f49650ddf8a5", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Saturday 31 August 2024 18:44:52 -0400 (0:00:00.044) 0:09:20.801 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Saturday 31 August 2024 18:44:53 -0400 (0:00:00.038) 0:09:20.840 ******* changed: [managed_node2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", 
"fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Saturday 31 August 2024 18:44:53 -0400 (0:00:00.347) 0:09:21.187 ******* ok: [managed_node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Saturday 31 August 2024 18:44:53 -0400 (0:00:00.497) 0:09:21.685 ******* changed: [managed_node2] => (item={u'src': u'/dev/mapper/luks-ee815e44-f265-47ad-8a0c-f49650ddf8a5', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-ee815e44-f265-47ad-8a0c-f49650ddf8a5", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-ee815e44-f265-47ad-8a0c-f49650ddf8a5" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Saturday 31 August 2024 18:44:54 -0400 (0:00:00.386) 0:09:22.071 ******* skipping: [managed_node2] => (item={u'src': u'/dev/mapper/luks-ee815e44-f265-47ad-8a0c-f49650ddf8a5', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-ee815e44-f265-47ad-8a0c-f49650ddf8a5", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Saturday 31 August 2024 18:44:54 -0400 (0:00:00.046) 0:09:22.117 ******* ok: [managed_node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Saturday 31 August 2024 18:44:54 -0400 (0:00:00.531) 0:09:22.649 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1725144240.8624842, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1725144238.1364362, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 917510, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, 
"isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1725144238.135436, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "18446744071584697334", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Saturday 31 August 2024 18:44:55 -0400 (0:00:00.421) 0:09:23.070 ******* changed: [managed_node2] => (item={u'state': u'present', u'password': u'-', u'name': u'luks-ee815e44-f265-47ad-8a0c-f49650ddf8a5', u'backing_device': u'/dev/mapper/foo-test1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-ee815e44-f265-47ad-8a0c-f49650ddf8a5", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Saturday 31 August 2024 18:44:55 -0400 (0:00:00.471) 0:09:23.542 ******* ok: [managed_node2] TASK [Verify role results] ***************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:488 Saturday 31 August 2024 18:44:56 -0400 (0:00:00.910) 0:09:24.452 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed_node2 TASK [Print out pool information] ********************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Saturday 31 August 2024 18:44:56 -0400 (0:00:00.127) 0:09:24.579 ******* ok: [managed_node2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-ee815e44-f265-47ad-8a0c-f49650ddf8a5", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-ee815e44-f265-47ad-8a0c-f49650ddf8a5", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": 
null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Saturday 31 August 2024 18:44:56 -0400 (0:00:00.067) 0:09:24.647 ******* skipping: [managed_node2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Saturday 31 August 2024 18:44:56 -0400 (0:00:00.066) 0:09:24.714 ******* ok: [managed_node2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "ee815e44-f265-47ad-8a0c-f49650ddf8a5" }, "/dev/mapper/luks-ee815e44-f265-47ad-8a0c-f49650ddf8a5": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-ee815e44-f265-47ad-8a0c-f49650ddf8a5", "size": "4G", "type": "crypt", "uuid": "1ac4bc79-012c-4741-b150-1a0c1bf4e6bf" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "e3t0qk-oSfp-di6k-1NWD-tOt2-EOV5-cOetCf" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Saturday 31 August 2024 18:44:57 -0400 (0:00:00.377) 0:09:25.091 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003330", "end": "2024-08-31 18:44:57.557861", "rc": 0, "start": "2024-08-31 18:44:57.554531" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Thu Jun 20 10:23:46 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # 
TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Saturday 31 August 2024 18:44:57 -0400 (0:00:00.377) 0:09:25.091 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003330", "end": "2024-08-31 18:44:57.557861", "rc": 0, "start": "2024-08-31 18:44:57.554531" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Thu Jun 20 10:23:46 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-ee815e44-f265-47ad-8a0c-f49650ddf8a5 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Saturday 31 August 2024 18:44:57 -0400 (0:00:00.370) 0:09:25.462 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003138", "end": "2024-08-31 18:44:58.040032", "failed_when_result": false, "rc": 0, "start": "2024-08-31 18:44:58.036894" } STDOUT: luks-ee815e44-f265-47ad-8a0c-f49650ddf8a5 /dev/mapper/foo-test1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Saturday 31 August 2024 18:44:58 -0400 (0:00:00.455) 0:09:25.917 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed_node2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Saturday 31 August 2024 18:44:58 -0400 (0:00:00.089) 0:09:26.007 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Saturday 31 August 2024 18:44:58 -0400 (0:00:00.040) 0:09:26.048 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.021597", "end": "2024-08-31 18:44:58.523999", "rc": 0, "start": "2024-08-31 18:44:58.502402" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Saturday 31 August 2024 18:44:58 -0400 (0:00:00.385) 0:09:26.433 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed
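
The shared-VG check above reduces to one vgs call and one assertion on its single-character output; a minimal sketch of that pattern (not the test's exact task code; the register name is illustrative):

- name: Get VG shared value status (sketch)
  command: vgs --noheadings --binary -o shared foo
  register: storage_test_vg_shared
  changed_when: false

- name: Verify that VG shared value checks out (sketch)
  assert:
    that:
      - storage_test_vg_shared.stdout | trim == '0'  # the pool was created with shared: false
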
TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Saturday 31 August 2024 18:44:58 -0400 (0:00:00.077) 0:09:26.511 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed_node2 included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed_node2 TASK [Set test variables] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Saturday 31 August 2024 18:44:58 -0400 (0:00:00.109) 0:09:26.621 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Saturday 31 August 2024 18:44:58 -0400 (0:00:00.070) 0:09:26.691 ******* ok: [managed_node2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Saturday 31 August 2024 18:44:59 -0400 (0:00:00.395) 0:09:27.086 ******* ok: [managed_node2] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Saturday 31 August 2024 18:44:59 -0400 (0:00:00.054) 0:09:27.141 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Saturday 31 August 2024 18:44:59 -0400 (0:00:00.059) 0:09:27.201 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Saturday 31 August 2024 18:44:59 -0400 (0:00:00.069) 0:09:27.271 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Saturday 31 August 2024 18:44:59 -0400 (0:00:00.069) 0:09:27.341 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Saturday 31 August 2024 18:44:59 -0400 (0:00:00.054) 0:09:27.395 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }
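
The PV bookkeeping above compares the pool's declared member list against what LVM reports; roughly as follows (the pvs --select filter is one way to scope the query, not necessarily the test's, and the register name is illustrative):

- name: List the PVs that belong to the pool (sketch)
  command: pvs --noheadings -o pv_name --select vg_name=foo
  register: pool_pvs
  changed_when: false

- name: Verify PV count and membership (sketch)
  assert:
    that:
      - pool_pvs.stdout_lines | length == 1
      - "'/dev/sda' in pool_pvs.stdout"
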
TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:51 Saturday 31 August 2024 18:44:59 -0400 (0:00:00.041) 0:09:27.437 ******* ok: [managed_node2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:64 Saturday 31 August 2024 18:44:59 -0400 (0:00:00.057) 0:09:27.495 ******* ok: [managed_node2] => { "changed": false, "rc": 0 } STDOUT: False STDERR: Shared connection to 10.31.12.24 closed. TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:73 Saturday 31 August 2024 18:44:59 -0400 (0:00:00.289) 0:09:27.784 ******* skipping: [managed_node2] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:83 Saturday 31 August 2024 18:45:00 -0400 (0:00:00.046) 0:09:27.830 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed_node2 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Saturday 31 August 2024 18:45:00 -0400 (0:00:00.073) 0:09:27.904 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Saturday 31 August 2024 18:45:00 -0400 (0:00:00.045) 0:09:27.949 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Saturday 31 August 2024 18:45:00 -0400 (0:00:00.051) 0:09:28.000 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Saturday 31 August 2024 18:45:00 -0400 (0:00:00.053) 0:09:28.054 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Saturday 31 August 2024 18:45:00 -0400 (0:00:00.082) 0:09:28.136 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Saturday 31 August 2024 18:45:00 -0400 (0:00:00.052) 0:09:28.189 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Saturday 31 August 2024 18:45:00 -0400 (0:00:00.052) 0:09:28.242 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Saturday 31 August 2024 18:45:00 -0400 (0:00:00.056) 0:09:28.299 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Saturday 31 August 2024 18:45:00 -0400 (0:00:00.057) 0:09:28.356 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Saturday 31 August 2024 18:45:00 -0400 (0:00:00.055) 0:09:28.411 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Saturday 31 August 2024 18:45:00 -0400 (0:00:00.055) 0:09:28.467 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:86 Saturday 31 August 2024 18:45:00 -0400 (0:00:00.055) 0:09:28.522 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed_node2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Saturday 31 August 2024 18:45:00 -0400 (0:00:00.199) 0:09:28.722 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed_node2 TASK [Get information about the LV] ******************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Saturday 31 August 2024 18:45:01 -0400 (0:00:00.123) 0:09:28.845 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Saturday 31 August 2024 18:45:01 -0400 (0:00:00.056) 0:09:28.901 ******* 
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Saturday 31 August 2024 18:45:01 -0400 (0:00:00.055) 0:09:28.957 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Saturday 31 August 2024 18:45:01 -0400 (0:00:00.054) 0:09:29.012 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Saturday 31 August 2024 18:45:01 -0400 (0:00:00.058) 0:09:29.071 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Saturday 31 August 2024 18:45:01 -0400 (0:00:00.057) 0:09:29.128 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Saturday 31 August 2024 18:45:01 -0400 (0:00:00.070) 0:09:29.198 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:89 Saturday 31 August 2024 18:45:01 -0400 (0:00:00.053) 0:09:29.252 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed_node2 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Saturday 31 August 2024 18:45:01 -0400 (0:00:00.127) 0:09:29.379 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed_node2 TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Saturday 31 August 2024 18:45:01 -0400 (0:00:00.120) 0:09:29.500 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Saturday 31 August 2024 18:45:01 -0400 (0:00:00.055) 0:09:29.556 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Saturday 31 August 2024 18:45:01 -0400 (0:00:00.055) 0:09:29.611 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Saturday 31 August 2024 18:45:01 -0400 (0:00:00.039) 0:09:29.651 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:92 Saturday 31 August 2024 18:45:01 -0400 (0:00:00.038) 0:09:29.690 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed_node2 TASK [Set test variables] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Saturday 31 August 2024 18:45:01 -0400 (0:00:00.090) 0:09:29.780 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Saturday 31 August 2024 18:45:02 -0400 (0:00:00.045) 0:09:29.826 ******* skipping: [managed_node2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Saturday 31 August 2024 18:45:02 -0400 (0:00:00.042) 0:09:29.868 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed_node2 TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Saturday 31 August 2024 18:45:02 -0400 (0:00:00.074) 0:09:29.943 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Saturday 31 August 2024 18:45:02 -0400 (0:00:00.045) 0:09:29.988 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Saturday 31 August 2024 18:45:02 -0400 (0:00:00.053) 
0:09:30.042 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Saturday 31 August 2024 18:45:02 -0400 (0:00:00.036) 0:09:30.078 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Saturday 31 August 2024 18:45:02 -0400 (0:00:00.034) 0:09:30.113 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Saturday 31 August 2024 18:45:02 -0400 (0:00:00.034) 0:09:30.147 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Saturday 31 August 2024 18:45:02 -0400 (0:00:00.038) 0:09:30.186 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:95 Saturday 31 August 2024 18:45:02 -0400 (0:00:00.073) 0:09:30.260 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed_node2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Saturday 31 August 2024 18:45:02 -0400 (0:00:00.198) 0:09:30.458 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed_node2 TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Saturday 31 August 2024 18:45:02 -0400 (0:00:00.141) 0:09:30.600 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Saturday 31 August 2024 18:45:02 -0400 (0:00:00.076) 0:09:30.677 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Saturday 31 August 2024 18:45:02 -0400 (0:00:00.064) 0:09:30.741 ******* skipping: 
[managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Saturday 31 August 2024 18:45:03 -0400 (0:00:00.084) 0:09:30.825 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Saturday 31 August 2024 18:45:03 -0400 (0:00:00.065) 0:09:30.891 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Saturday 31 August 2024 18:45:03 -0400 (0:00:00.053) 0:09:30.944 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Saturday 31 August 2024 18:45:03 -0400 (0:00:00.047) 0:09:30.992 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:98 Saturday 31 August 2024 18:45:03 -0400 (0:00:00.059) 0:09:31.052 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed_node2 TASK [Run 'stratis report'] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Saturday 31 August 2024 18:45:03 -0400 (0:00:00.138) 0:09:31.190 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:11 Saturday 31 August 2024 18:45:03 -0400 (0:00:00.049) 0:09:31.240 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pool was created] **************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Saturday 31 August 2024 18:45:03 -0400 (0:00:00.050) 0:09:31.290 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:25 Saturday 31 August 2024 18:45:03 -0400 (0:00:00.042) 0:09:31.333 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] *********************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:34 Saturday 31 August 2024 18:45:03 -0400 (0:00:00.034) 0:09:31.368 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:44 Saturday 31 August 2024 18:45:03 -0400 (0:00:00.034) 0:09:31.402 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:101 Saturday 31 August 2024 18:45:03 -0400 (0:00:00.045) 0:09:31.447 ******* ok: [managed_node2] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Saturday 31 August 2024 18:45:03 -0400 (0:00:00.062) 0:09:31.510 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed_node2 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Saturday 31 August 2024 18:45:03 -0400 (0:00:00.139) 0:09:31.649 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Saturday 31 August 2024 18:45:03 -0400 (0:00:00.090) 0:09:31.740 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed_node2 included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed_node2 included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed_node2 included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed_node2 included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed_node2 included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed_node2 included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed_node2 included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed_node2 TASK [Get expected mount device based on device type] ************************** task path: 
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Saturday 31 August 2024 18:45:04 -0400 (0:00:00.470) 0:09:32.210 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-ee815e44-f265-47ad-8a0c-f49650ddf8a5" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Saturday 31 August 2024 18:45:04 -0400 (0:00:00.073) 0:09:32.284 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Saturday 31 August 2024 18:45:04 -0400 (0:00:00.074) 0:09:32.359 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Saturday 31 August 2024 18:45:04 -0400 (0:00:00.058) 0:09:32.418 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Saturday 31 August 2024 18:45:04 -0400 (0:00:00.065) 0:09:32.483 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Saturday 31 August 2024 18:45:04 -0400 (0:00:00.079) 0:09:32.563 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Saturday 31 August 2024 18:45:04 -0400 (0:00:00.098) 0:09:32.661 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Saturday 31 August 2024 18:45:04 -0400 (0:00:00.056) 0:09:32.718 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Saturday 31 August 2024 18:45:04 -0400 (0:00:00.067) 0:09:32.786 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: 
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Saturday 31 August 2024 18:45:05 -0400 (0:00:00.060) 0:09:32.846 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Saturday 31 August 2024 18:45:05 -0400 (0:00:00.066) 0:09:32.913 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Saturday 31 August 2024 18:45:05 -0400 (0:00:00.063) 0:09:32.976 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-ee815e44-f265-47ad-8a0c-f49650ddf8a5 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Saturday 31 August 2024 18:45:05 -0400 (0:00:00.096) 0:09:33.072 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Saturday 31 August 2024 18:45:05 -0400 (0:00:00.066) 0:09:33.139 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Saturday 31 August 2024 18:45:05 -0400 (0:00:00.068) 0:09:33.207 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Saturday 31 August 2024 18:45:05 -0400 (0:00:00.056) 0:09:33.264 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed
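
The fstab assertions above work by counting lines from the earlier cat of /etc/fstab that match the device, mount point, and options, then comparing those counts with the expected values; schematically (variable names here are illustrative, not the test's):

- name: Collect fstab lines that mention the device (illustrative names)
  set_fact:
    fstab_id_matches: "{{ fstab_contents.stdout_lines | select('search', '^/dev/mapper/luks-ee815e44-f265-47ad-8a0c-f49650ddf8a5 ') | list }}"

- name: Verify that the device identifier appears exactly once (sketch)
  assert:
    that:
      - fstab_id_matches | length == 1
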
"storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Saturday 31 August 2024 18:45:05 -0400 (0:00:00.071) 0:09:33.412 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Saturday 31 August 2024 18:45:05 -0400 (0:00:00.098) 0:09:33.510 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Saturday 31 August 2024 18:45:05 -0400 (0:00:00.090) 0:09:33.601 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1725144291.8593817, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1725144291.8593817, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 247755, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1725144291.8593817, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Saturday 31 August 2024 18:45:06 -0400 (0:00:00.481) 0:09:34.082 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Saturday 31 August 2024 18:45:06 -0400 (0:00:00.067) 0:09:34.149 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Saturday 31 August 2024 18:45:06 -0400 (0:00:00.060) 0:09:34.210 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Saturday 31 August 2024 18:45:06 -0400 (0:00:00.095) 0:09:34.305 ******* ok: [managed_node2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Saturday 31 August 2024 18:45:06 -0400 (0:00:00.066) 
TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Saturday 31 August 2024 18:45:06 -0400 (0:00:00.066) 0:09:34.372 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Saturday 31 August 2024 18:45:06 -0400 (0:00:00.059) 0:09:34.432 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Saturday 31 August 2024 18:45:06 -0400 (0:00:00.084) 0:09:34.516 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1725144292.0063844, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1725144292.0063844, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 259676, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1725144292.0063844, "nlink": 1, "path": "/dev/mapper/luks-ee815e44-f265-47ad-8a0c-f49650ddf8a5", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Saturday 31 August 2024 18:45:07 -0400 (0:00:00.517) 0:09:35.034 ******* ok: [managed_node2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Saturday 31 August 2024 18:45:08 -0400 (0:00:00.895) 0:09:35.929 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.028142", "end": "2024-08-31 18:45:08.617281", "rc": 0, "start": "2024-08-31 18:45:08.589139" } STDOUT: LUKS header information for /dev/mapper/foo-test1 Version: 1 Cipher name: aes Cipher mode: xts-plain64 Hash spec: sha256 Payload offset: 8192 MK bits: 512 MK digest: fb 9b cf 67 32 26 91 18 5d 1c fd 79 c6 b8 c7 9c 7f e7 50 96 MK salt: 67 a5 a1 dd 37 a2 2c d8 b9 1b fb ba 5c 91 be f4 ea 7b c5 4d e7 e9 4d 93 90 3b 88 7d f9 b6 45 75 MK iterations: 20582 UUID: ee815e44-f265-47ad-8a0c-f49650ddf8a5 Key Slot 0: ENABLED Iterations: 329326 Salt: 00 eb a7 e5 52 e6 09 27 7c c7 80 52 2e ed ff e2 79 4f 99 15 ec 8e 5d a7 5a e8 b9 3a 4b 0d 53 4b Key material offset: 8 AF stripes: 4000 Key Slot 1: DISABLED Key Slot 2: DISABLED Key Slot 3: DISABLED Key Slot 4: DISABLED Key Slot 5: DISABLED Key Slot 6: DISABLED Key Slot 7: DISABLED
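
When LUKS settings are specified explicitly, the "Check LUKS version/key size/cipher" tasks that follow match fields out of exactly this kind of luksDump output (here they are skipped because the test left them at defaults). A sketch of that pattern, asserting on values visible in the header above:

- name: Collect LUKS info for this volume (sketch)
  command: cryptsetup luksDump /dev/mapper/foo-test1
  register: luks_dump
  changed_when: false

- name: Check selected LUKS header fields (sketch; expected values taken from the dump above)
  assert:
    that:
      - luks_dump.stdout is search('Version:\s+1')
      - luks_dump.stdout is search('Cipher name:\s+aes')
      - luks_dump.stdout is search('Cipher mode:\s+xts-plain64')
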
TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Saturday 31 August 2024 18:45:08 -0400 (0:00:00.583) 0:09:36.513 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Saturday 31 August 2024 18:45:08 -0400 (0:00:00.064) 0:09:36.577 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Saturday 31 August 2024 18:45:08 -0400 (0:00:00.061) 0:09:36.639 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Saturday 31 August 2024 18:45:08 -0400 (0:00:00.047) 0:09:36.686 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Saturday 31 August 2024 18:45:08 -0400 (0:00:00.047) 0:09:36.734 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Saturday 31 August 2024 18:45:08 -0400 (0:00:00.047) 0:09:36.781 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Saturday 31 August 2024 18:45:09 -0400 (0:00:00.055) 0:09:36.837 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Saturday 31 August 2024 18:45:09 -0400 (0:00:00.046) 0:09:36.883 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-ee815e44-f265-47ad-8a0c-f49650ddf8a5 /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Saturday 31 August 2024 18:45:09 -0400 (0:00:00.057) 0:09:36.941 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Saturday 31 August 2024 18:45:09 -0400 (0:00:00.053) 0:09:36.994 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Check backing 
device of crypttab entry] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Saturday 31 August 2024 18:45:09 -0400 (0:00:00.061) 0:09:37.056 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Saturday 31 August 2024 18:45:09 -0400 (0:00:00.057) 0:09:37.113 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Saturday 31 August 2024 18:45:09 -0400 (0:00:00.055) 0:09:37.168 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Saturday 31 August 2024 18:45:09 -0400 (0:00:00.053) 0:09:37.221 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Saturday 31 August 2024 18:45:09 -0400 (0:00:00.044) 0:09:37.265 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Saturday 31 August 2024 18:45:09 -0400 (0:00:00.039) 0:09:37.305 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Saturday 31 August 2024 18:45:09 -0400 (0:00:00.045) 0:09:37.350 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Saturday 31 August 2024 18:45:09 -0400 (0:00:00.042) 0:09:37.392 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Saturday 31 August 2024 18:45:09 -0400 (0:00:00.037) 0:09:37.429 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: 
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Saturday 31 August 2024 18:45:09 -0400 (0:00:00.034) 0:09:37.464 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Saturday 31 August 2024 18:45:09 -0400 (0:00:00.033) 0:09:37.498 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Saturday 31 August 2024 18:45:09 -0400 (0:00:00.039) 0:09:37.537 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Saturday 31 August 2024 18:45:09 -0400 (0:00:00.056) 0:09:37.594 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Saturday 31 August 2024 18:45:09 -0400 (0:00:00.087) 0:09:37.682 ******* ok: [managed_node2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Saturday 31 August 2024 18:45:10 -0400 (0:00:00.373) 0:09:38.055 ******* ok: [managed_node2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Saturday 31 August 2024 18:45:10 -0400 (0:00:00.372) 0:09:38.428 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Saturday 31 August 2024 18:45:10 -0400 (0:00:00.059) 0:09:38.487 ******* ok: [managed_node2] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Saturday 31 August 2024 18:45:10 -0400 (0:00:00.045) 0:09:38.532 ******* ok: [managed_node2] => { "bytes": 10737418240, "changed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Saturday 31 August 2024 18:45:11 -0400 
(0:00:00.380) 0:09:38.913 ******* skipping: [managed_node2] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Saturday 31 August 2024 18:45:11 -0400 (0:00:00.064) 0:09:38.978 ******* skipping: [managed_node2] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Saturday 31 August 2024 18:45:11 -0400 (0:00:00.061) 0:09:39.039 ******* skipping: [managed_node2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Saturday 31 August 2024 18:45:11 -0400 (0:00:00.057) 0:09:39.097 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Saturday 31 August 2024 18:45:11 -0400 (0:00:00.051) 0:09:39.148 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Saturday 31 August 2024 18:45:11 -0400 (0:00:00.046) 0:09:39.195 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Saturday 31 August 2024 18:45:11 -0400 (0:00:00.041) 0:09:39.237 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Saturday 31 August 2024 18:45:11 -0400 (0:00:00.035) 0:09:39.272 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Saturday 31 August 2024 18:45:11 -0400 (0:00:00.042) 0:09:39.315 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Saturday 31 August 2024 18:45:11 -0400 (0:00:00.051) 0:09:39.367 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Saturday 31 August 2024 18:45:11 -0400 
(0:00:00.052) 0:09:39.419 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Saturday 31 August 2024 18:45:11 -0400 (0:00:00.056) 0:09:39.476 ******* skipping: [managed_node2] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Saturday 31 August 2024 18:45:11 -0400 (0:00:00.054) 0:09:39.530 ******* skipping: [managed_node2] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Saturday 31 August 2024 18:45:11 -0400 (0:00:00.051) 0:09:39.582 ******* skipping: [managed_node2] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Saturday 31 August 2024 18:45:11 -0400 (0:00:00.051) 0:09:39.633 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Saturday 31 August 2024 18:45:11 -0400 (0:00:00.051) 0:09:39.684 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Saturday 31 August 2024 18:45:11 -0400 (0:00:00.060) 0:09:39.745 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Saturday 31 August 2024 18:45:11 -0400 (0:00:00.055) 0:09:39.801 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Saturday 31 August 2024 18:45:12 -0400 (0:00:00.051) 0:09:39.852 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Saturday 31 August 2024 18:45:12 -0400 (0:00:00.052) 0:09:39.904 ******* ok: [managed_node2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } }
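The size tasks above render the same quantity three ways (bytes, lvm, parted) and then compare expected against actual. One way to reproduce the "4 GiB" -> 4294967296 conversion and the final comparison with a stock filter (a sketch; the tests themselves parse sizes through their own helper includes):

  - name: Convert the human-readable size to bytes
    set_fact:
      storage_test_expected_size: "{{ '4 GiB' | human_to_bytes }}"   # 4294967296

  - name: Assert expected size is actual size
    assert:
      that:
        - storage_test_expected_size | int == storage_test_actual_size.bytes | int
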
TASK [Show expected size] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Saturday 31 August 2024 18:45:12 -0400 (0:00:00.063) 0:09:39.968 ******* ok: [managed_node2] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Saturday 31 August 2024 18:45:12 -0400 (0:00:00.058) 0:09:40.026 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Saturday 31 August 2024 18:45:12 -0400 (0:00:00.070) 0:09:40.097 ******* ok: [managed_node2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.021770", "end": "2024-08-31 18:45:12.674900", "rc": 0, "start": "2024-08-31 18:45:12.653130" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Saturday 31 August 2024 18:45:12 -0400 (0:00:00.482) 0:09:40.580 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Saturday 31 August 2024 18:45:12 -0400 (0:00:00.067) 0:09:40.647 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Saturday 31 August 2024 18:45:12 -0400 (0:00:00.086) 0:09:40.733 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Saturday 31 August 2024 18:45:13 -0400 (0:00:00.140) 0:09:40.874 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Saturday 31 August 2024 18:45:13 -0400 (0:00:00.074) 0:09:40.948 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Saturday 31 August 2024 18:45:13 -0400 (0:00:00.113) 0:09:41.062 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }
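The cache/segment-type verification above hangs off a single lvs invocation; --nameprefixes makes the output trivially parseable as KEY=value pairs. The same query, reproduced as a standalone pair of tasks (lvs_out is an illustrative register name, not the test's):

  - name: Query the LV the same way the task above does
    command: lvs --noheadings --nameprefixes --units=b --nosuffix --unquoted -o name,attr,cache_total_blocks,chunk_size,segtype foo/test1
    register: lvs_out
    changed_when: false

  - name: Confirm the segment type is linear
    assert:
      that:
        - "'LVM2_SEGTYPE=linear' in lvs_out.stdout"
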
TASK [Clean up facts] ********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Saturday 31 August 2024 18:45:13 -0400 (0:00:00.090) 0:09:41.153 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Saturday 31 August 2024 18:45:13 -0400 (0:00:00.059) 0:09:41.213 ******* TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Saturday 31 August 2024 18:45:13 -0400 (0:00:00.051) 0:09:41.264 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Clean up] **************************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:491 Saturday 31 August 2024 18:45:13 -0400 (0:00:00.214) 0:09:41.322 ******* TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 31 August 2024 18:45:13 -0400 (0:00:00.214) 0:09:41.537 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 31 August 2024 18:45:13 -0400 (0:00:00.119) 0:09:41.656 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 31 August 2024 18:45:13 -0400 (0:00:00.065) 0:09:41.721 ******* skipping: [managed_node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed_node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed_node2] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed_node2] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" }
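The loop above is a first-found pattern over increasingly specific var files: RedHat.yml and CentOS.yml are skipped, CentOS_7.yml matches, and CentOS_7.9.yml does not exist on disk. Judging from the ansible_facts it sets, the loaded vars file looks roughly like this (reconstructed from the output above, not copied from the role source):

  # roles/storage/vars/CentOS_7.yml (reconstruction)
  __storage_blivet_diskvolume_mkfs_option_map:
    ext2: -F
    ext3: -F
    ext4: -F
  blivet_package_list:
    - python-enum34
    - python-blivet3
    - libblockdev-crypto
    - libblockdev-dm
    - libblockdev-lvm
    - libblockdev-mdraid
    - libblockdev-swap
    - "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}"
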
TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 31 August 2024 18:45:14 -0400 (0:00:00.141) 0:09:41.863 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 31 August 2024 18:45:14 -0400 (0:00:00.077) 0:09:41.941 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 31 August 2024 18:45:14 -0400 (0:00:00.069) 0:09:42.010 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 31 August 2024 18:45:14 -0400 (0:00:00.059) 0:09:42.070 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 31 August 2024 18:45:14 -0400 (0:00:00.055) 0:09:42.125 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 31 August 2024 18:45:14 -0400 (0:00:00.143) 0:09:42.268 ******* ok: [managed_node2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34 TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 31 August 2024 18:45:16 -0400 (0:00:01.577) 0:09:43.846 ******* ok: [managed_node2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14
Saturday 31 August 2024 18:45:16 -0400 (0:00:00.056) 0:09:43.902 ******* ok: [managed_node2] => { "storage_volumes": [ { "disks": [ "sda" ], "name": "foo", "state": "absent", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 31 August 2024 18:45:16 -0400 (0:00:00.079) 0:09:43.982 ******* ok: [managed_node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Saturday 31 August 2024 18:45:20 -0400 (0:00:04.246) 0:09:48.229 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed_node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 31 August 2024 18:45:20 -0400 (0:00:00.111) 0:09:48.340 ******* TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 31 August 2024 18:45:20 -0400 (0:00:00.069) 0:09:48.410 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 31 August 2024 18:45:20 -0400 (0:00:00.227) 0:09:48.637 ******* TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 31 August 2024 18:45:20 -0400 (0:00:00.116) 0:09:48.754 ******* ok: [managed_node2] => { "changed": false, "rc": 0, "results": [ "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } lsrpackages: kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Saturday 31 August 2024 18:45:21 -0400 (0:00:00.917) 0:09:49.671 ******* ok: [managed_node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": 
"autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", 
"status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": 
"systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": 
"disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": 
"systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Saturday 31 August 2024 18:45:23 -0400 (0:00:01.448) 0:09:51.120 ******* ok: [managed_node2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Saturday 31 August 2024 18:45:23 -0400 (0:00:00.080) 0:09:51.201 ******* TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Saturday 31 August 2024 18:45:23 -0400 (0:00:00.046) 0:09:51.247 ******* changed: [managed_node2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-ee815e44-f265-47ad-8a0c-f49650ddf8a5", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-ee815e44-f265-47ad-8a0c-f49650ddf8a5", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-ee815e44-f265-47ad-8a0c-f49650ddf8a5", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", 
"/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-ee815e44-f265-47ad-8a0c-f49650ddf8a5", "state": "absent" } ], "packages": [ "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=e3t0qk-oSfp-di6k-1NWD-tOt2-EOV5-cOetCf", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Saturday 31 August 2024 18:45:58 -0400 (0:00:35.284) 0:10:26.532 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Saturday 31 August 2024 18:45:58 -0400 (0:00:00.053) 0:10:26.586 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1725144294.1664224, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "ad750f049c7b6fa187b6289e17347181c4a55a3e", "ctime": 1725144294.1634223, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263550, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1725144294.1634223, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1311, "uid": 0, "version": "18446744071584693682", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Saturday 31 August 2024 18:45:59 -0400 (0:00:00.698) 0:10:27.284 ******* ok: [managed_node2] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Saturday 31 August 2024 18:46:00 -0400 (0:00:00.564) 0:10:27.848 ******* TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: 
TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Saturday 31 August 2024 18:46:00 -0400 (0:00:00.057) 0:10:27.905 ******* ok: [managed_node2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-ee815e44-f265-47ad-8a0c-f49650ddf8a5", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-ee815e44-f265-47ad-8a0c-f49650ddf8a5", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-ee815e44-f265-47ad-8a0c-f49650ddf8a5", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-ee815e44-f265-47ad-8a0c-f49650ddf8a5", "state": "absent" } ], "packages": [ "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=e3t0qk-oSfp-di6k-1NWD-tOt2-EOV5-cOetCf", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Saturday 31 August 2024 18:46:00 -0400 (0:00:00.080) 0:10:27.986 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Saturday 31 August 2024 18:46:00 -0400 (0:00:00.050) 0:10:28.036 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": "UUID=e3t0qk-oSfp-di6k-1NWD-tOt2-EOV5-cOetCf", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv",
"mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Saturday 31 August 2024 18:46:00 -0400 (0:00:00.054) 0:10:28.091 ******* changed: [managed_node2] => (item={u'src': u'/dev/mapper/luks-ee815e44-f265-47ad-8a0c-f49650ddf8a5', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-ee815e44-f265-47ad-8a0c-f49650ddf8a5", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-ee815e44-f265-47ad-8a0c-f49650ddf8a5" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Saturday 31 August 2024 18:46:00 -0400 (0:00:00.392) 0:10:28.484 ******* ok: [managed_node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Saturday 31 August 2024 18:46:01 -0400 (0:00:00.556) 0:10:29.040 ******* TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Saturday 31 August 2024 18:46:01 -0400 (0:00:00.049) 0:10:29.090 ******* TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Saturday 31 August 2024 18:46:01 -0400 (0:00:00.051) 0:10:29.142 ******* ok: [managed_node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Saturday 31 August 2024 18:46:01 -0400 (0:00:00.617) 0:10:29.759 ******* ok: [managed_node2] => { "changed": false, "stat": { "atime": 1725144298.0384903, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "549bbe845bb53fab49a6c992e00c05a9291188db", "ctime": 1725144295.5854473, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 917511, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1725144295.5844471, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": 
false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "18446744071584697529", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Saturday 31 August 2024 18:46:02 -0400 (0:00:00.380) 0:10:30.139 ******* changed: [managed_node2] => (item={u'state': u'absent', u'password': u'-', u'name': u'luks-ee815e44-f265-47ad-8a0c-f49650ddf8a5', u'backing_device': u'/dev/mapper/foo-test1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-ee815e44-f265-47ad-8a0c-f49650ddf8a5", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Saturday 31 August 2024 18:46:02 -0400 (0:00:00.414) 0:10:30.554 ******* ok: [managed_node2] TASK [Verify role results] ***************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:501 Saturday 31 August 2024 18:46:03 -0400 (0:00:00.798) 0:10:31.353 ******* included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed_node2 TASK [Print out pool information] ********************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Saturday 31 August 2024 18:46:03 -0400 (0:00:00.129) 0:10:31.482 ******* skipping: [managed_node2] => {} TASK [Print out volume information] ******************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Saturday 31 August 2024 18:46:03 -0400 (0:00:00.054) 0:10:31.537 ******* ok: [managed_node2] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": "UUID=e3t0qk-oSfp-di6k-1NWD-tOt2-EOV5-cOetCf", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
TASK [Collect info about the volumes.] *****************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15
Saturday 31 August 2024 18:46:03 -0400 (0:00:00.064) 0:10:31.602 *******
ok: [managed_node2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } }

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20
Saturday 31 August 2024 18:46:04 -0400 (0:00:00.373) 0:10:31.976 *******
ok: [managed_node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003329", "end": "2024-08-31 18:46:04.450802", "rc": 0, "start": "2024-08-31 18:46:04.447473" }
STDOUT:
# system_role:storage
#
# /etc/fstab
# Created by anaconda on Thu Jun 20 10:23:46 2024
#
# Accessible filesystems, by reference, are maintained under '/dev/disk'
# See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info
#
UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
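The fstab dump above shows only the root filesystem and the pre-existing NFS mounts; the luks-* entry for /opt/test1 is gone. A minimal sketch of this read-and-assert pattern, with an illustrative assertion that is not the test's literal code:

    # Sketch only: read fstab, then assert the removed mapping is absent.
    - name: Read the /etc/fstab file for volume existence
      command: cat /etc/fstab
      register: storage_test_fstab
      changed_when: false

    - name: Assert the removed LUKS mapping no longer appears in /etc/fstab
      assert:
        that:
          - "'luks-ee815e44-f265-47ad-8a0c-f49650ddf8a5' not in storage_test_fstab.stdout"
        msg: stale LUKS entry is still present in /etc/fstab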
TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25
Saturday 31 August 2024 18:46:04 -0400 (0:00:00.365) 0:10:32.342 *******
ok: [managed_node2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003199", "end": "2024-08-31 18:46:04.860127", "failed_when_result": false, "rc": 0, "start": "2024-08-31 18:46:04.856928" }

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34
Saturday 31 August 2024 18:46:04 -0400 (0:00:00.439) 0:10:32.782 *******

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44
Saturday 31 August 2024 18:46:05 -0400 (0:00:00.047) 0:10:32.829 *******
included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed_node2

TASK [Set storage volume test variables] ***************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2
Saturday 31 August 2024 18:46:05 -0400 (0:00:00.100) 0:10:32.929 *******
ok: [managed_node2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [Run test verify for {{ storage_test_volume_subset }}] ********************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19
Saturday 31 August 2024 18:46:05 -0400 (0:00:00.054) 0:10:32.984 *******
included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed_node2
included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed_node2
included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed_node2
included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed_node2
included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed_node2
included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed_node2
included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed_node2
included: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed_node2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7
Saturday 31 August 2024 18:46:05 -0400 (0:00:00.188) 0:10:33.172 *******
ok: [managed_node2] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11
Saturday 31 August 2024 18:46:05 -0400
(0:00:00.047) 0:10:33.220 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Saturday 31 August 2024 18:46:05 -0400 (0:00:00.059) 0:10:33.279 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Saturday 31 August 2024 18:46:05 -0400 (0:00:00.053) 0:10:33.333 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Saturday 31 August 2024 18:46:05 -0400 (0:00:00.056) 0:10:33.389 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Saturday 31 August 2024 18:46:05 -0400 (0:00:00.044) 0:10:33.434 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Saturday 31 August 2024 18:46:05 -0400 (0:00:00.049) 0:10:33.483 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Saturday 31 August 2024 18:46:05 -0400 (0:00:00.043) 0:10:33.527 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Saturday 31 August 2024 18:46:05 -0400 (0:00:00.035) 0:10:33.563 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Saturday 31 August 2024 18:46:05 -0400 (0:00:00.034) 0:10:33.597 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Saturday 31 August 2024 18:46:05 -0400 (0:00:00.038) 0:10:33.636 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_found_mount_stat": null, 
"storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Saturday 31 August 2024 18:46:05 -0400 (0:00:00.047) 0:10:33.683 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Saturday 31 August 2024 18:46:05 -0400 (0:00:00.092) 0:10:33.775 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Saturday 31 August 2024 18:46:06 -0400 (0:00:00.049) 0:10:33.825 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Saturday 31 August 2024 18:46:06 -0400 (0:00:00.059) 0:10:33.884 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Saturday 31 August 2024 18:46:06 -0400 (0:00:00.051) 0:10:33.936 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:51 Saturday 31 August 2024 18:46:06 -0400 (0:00:00.081) 0:10:34.018 ******* ok: [managed_node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Saturday 31 August 2024 18:46:06 -0400 (0:00:00.104) 0:10:34.122 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Saturday 31 August 2024 18:46:06 -0400 (0:00:00.091) 0:10:34.214 ******* skipping: 
[managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [See whether the device node is present] **********************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3
Saturday 31 August 2024 18:46:06 -0400 (0:00:00.069) 0:10:34.284 *******
ok: [managed_node2] => { "changed": false, "stat": { "atime": 1725144358.5505533, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1725144358.5505533, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 27840, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1725144358.5505533, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9
Saturday 31 August 2024 18:46:06 -0400 (0:00:00.438) 0:10:34.723 *******
ok: [managed_node2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16
Saturday 31 August 2024 18:46:06 -0400 (0:00:00.062) 0:10:34.785 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23
Saturday 31 August 2024 18:46:07 -0400 (0:00:00.053) 0:10:34.838 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Process volume type (set initial value) (1/2)] ***************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29
Saturday 31 August 2024 18:46:07 -0400 (0:00:00.036) 0:10:34.875 *******
ok: [managed_node2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false }

TASK [Process volume type (get RAID value) (2/2)] ******************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33
Saturday 31 August 2024 18:46:07 -0400 (0:00:00.050) 0:10:34.925 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38
Saturday 31 August 2024 18:46:07 -0400 (0:00:00.061) 0:10:34.987 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }
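Even though the volume was removed, the underlying disk node /dev/sda must still exist as a raw block device, which is what the stat-and-assert pair above confirms. A minimal sketch of that pattern, with illustrative variable names rather than the test's literal tasks:

    # Sketch only: stat the node, then assert it is a block device.
    - name: See whether the device node is present
      stat:
        path: /dev/sda
      register: storage_test_dev

    - name: Verify the presence/absence of the device node
      assert:
        that:
          - storage_test_dev.stat.exists
          - storage_test_dev.stat.isblk
        msg: expected /dev/sda to exist as a block device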
TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3
Saturday 31 August 2024 18:46:07 -0400 (0:00:00.044) 0:10:35.031 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Ensure cryptsetup is present] ********************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10
Saturday 31 August 2024 18:46:07 -0400 (0:00:00.047) 0:10:35.078 *******
ok: [managed_node2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] }
lsrpackages: cryptsetup

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16
Saturday 31 August 2024 18:46:08 -0400 (0:00:00.804) 0:10:35.883 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22
Saturday 31 August 2024 18:46:08 -0400 (0:00:00.055) 0:10:35.938 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29
Saturday 31 August 2024 18:46:08 -0400 (0:00:00.049) 0:10:35.988 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40
Saturday 31 August 2024 18:46:08 -0400 (0:00:00.045) 0:10:36.034 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46
Saturday 31 August 2024 18:46:08 -0400 (0:00:00.086) 0:10:36.120 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51
Saturday 31 August 2024 18:46:08 -0400 (0:00:00.068) 0:10:36.189 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63
Saturday 31 August 2024 18:46:08 -0400 (0:00:00.052) 0:10:36.242 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path:
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Saturday 31 August 2024 18:46:08 -0400 (0:00:00.047) 0:10:36.289 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Saturday 31 August 2024 18:46:08 -0400 (0:00:00.053) 0:10:36.343 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Saturday 31 August 2024 18:46:08 -0400 (0:00:00.064) 0:10:36.408 ******* ok: [managed_node2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Saturday 31 August 2024 18:46:08 -0400 (0:00:00.069) 0:10:36.477 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Saturday 31 August 2024 18:46:08 -0400 (0:00:00.067) 0:10:36.545 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Saturday 31 August 2024 18:46:08 -0400 (0:00:00.052) 0:10:36.598 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Saturday 31 August 2024 18:46:08 -0400 (0:00:00.060) 0:10:36.659 ******* ok: [managed_node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Saturday 31 August 2024 18:46:08 -0400 (0:00:00.054) 0:10:36.713 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Saturday 31 August 2024 18:46:08 -0400 (0:00:00.050) 0:10:36.764 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] 
************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Saturday 31 August 2024 18:46:08 -0400 (0:00:00.049) 0:10:36.813 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Saturday 31 August 2024 18:46:09 -0400 (0:00:00.052) 0:10:36.866 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Saturday 31 August 2024 18:46:09 -0400 (0:00:00.049) 0:10:36.916 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Saturday 31 August 2024 18:46:09 -0400 (0:00:00.055) 0:10:36.971 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Saturday 31 August 2024 18:46:09 -0400 (0:00:00.112) 0:10:37.083 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Saturday 31 August 2024 18:46:09 -0400 (0:00:00.046) 0:10:37.130 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Saturday 31 August 2024 18:46:09 -0400 (0:00:00.038) 0:10:37.168 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Saturday 31 August 2024 18:46:09 -0400 (0:00:00.033) 0:10:37.202 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Saturday 31 August 2024 18:46:09 -0400 (0:00:00.033) 0:10:37.235 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Saturday 31 August 2024 18:46:09 
-0400 (0:00:00.032) 0:10:37.267 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Saturday 31 August 2024 18:46:09 -0400 (0:00:00.034) 0:10:37.302 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Saturday 31 August 2024 18:46:09 -0400 (0:00:00.034) 0:10:37.337 ******* ok: [managed_node2] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Saturday 31 August 2024 18:46:09 -0400 (0:00:00.035) 0:10:37.373 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Saturday 31 August 2024 18:46:09 -0400 (0:00:00.032) 0:10:37.405 ******* skipping: [managed_node2] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Saturday 31 August 2024 18:46:09 -0400 (0:00:00.032) 0:10:37.437 ******* skipping: [managed_node2] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Saturday 31 August 2024 18:46:09 -0400 (0:00:00.032) 0:10:37.470 ******* skipping: [managed_node2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Saturday 31 August 2024 18:46:09 -0400 (0:00:00.035) 0:10:37.505 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Saturday 31 August 2024 18:46:09 -0400 (0:00:00.032) 0:10:37.538 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Saturday 31 August 2024 18:46:09 -0400 (0:00:00.043) 0:10:37.581 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Saturday 31 August 2024 18:46:09 -0400 (0:00:00.032) 0:10:37.614 
******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Saturday 31 August 2024 18:46:09 -0400 (0:00:00.031) 0:10:37.646 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Saturday 31 August 2024 18:46:09 -0400 (0:00:00.032) 0:10:37.678 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Saturday 31 August 2024 18:46:09 -0400 (0:00:00.033) 0:10:37.712 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Saturday 31 August 2024 18:46:09 -0400 (0:00:00.032) 0:10:37.744 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Saturday 31 August 2024 18:46:09 -0400 (0:00:00.033) 0:10:37.777 ******* skipping: [managed_node2] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Saturday 31 August 2024 18:46:09 -0400 (0:00:00.032) 0:10:37.810 ******* skipping: [managed_node2] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Saturday 31 August 2024 18:46:10 -0400 (0:00:00.032) 0:10:37.843 ******* skipping: [managed_node2] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Saturday 31 August 2024 18:46:10 -0400 (0:00:00.032) 0:10:37.875 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Saturday 31 August 2024 18:46:10 -0400 (0:00:00.035) 0:10:37.911 ******* skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Saturday 31 August 2024 18:46:10 -0400 (0:00:00.033) 0:10:37.945 
*******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Calculate the expected thin pool volume size based on percentage value] ***
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131
Saturday 31 August 2024 18:46:10 -0400 (0:00:00.033) 0:10:37.979 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Replace expected volume size with calculated value] **********************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137
Saturday 31 August 2024 18:46:10 -0400 (0:00:00.033) 0:10:38.012 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Show actual size] ********************************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143
Saturday 31 August 2024 18:46:10 -0400 (0:00:00.033) 0:10:38.045 *******
ok: [managed_node2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } }

TASK [Show expected size] ******************************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147
Saturday 31 August 2024 18:46:10 -0400 (0:00:00.037) 0:10:38.083 *******
ok: [managed_node2] => { "storage_test_expected_size": "4294967296" }

TASK [Assert expected size is actual size] *************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151
Saturday 31 August 2024 18:46:10 -0400 (0:00:00.039) 0:10:38.123 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }
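Every size check in this subset is skipped because the volume is absent (_storage_test_volume_present was set to false earlier); when a volume is present, the subset ends by comparing the parsed actual size against the expected size. A minimal sketch of that closing assertion, assuming the actual size was parsed into a .bytes field and noting that storage_test_expected_size is a string in the output above, hence the int casts (illustrative, not the test's literal code):

    # Sketch only: compare sizes, guarded by the volume-present flag.
    - name: Assert expected size is actual size
      assert:
        that:
          - (storage_test_actual_size.bytes | int) == (storage_test_expected_size | int)
        msg: volume size mismatch
      when: _storage_test_volume_present | bool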
TASK [Get information about the LV] ********************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5
Saturday 31 August 2024 18:46:10 -0400 (0:00:00.033) 0:10:38.156 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set LV segment type] *****************************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13
Saturday 31 August 2024 18:46:10 -0400 (0:00:00.032) 0:10:38.189 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check segment type] ******************************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17
Saturday 31 August 2024 18:46:10 -0400 (0:00:00.032) 0:10:38.222 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set LV cache size] *******************************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24
Saturday 31 August 2024 18:46:10 -0400 (0:00:00.033) 0:10:38.255 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the requested cache size] ******************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31
Saturday 31 August 2024 18:46:10 -0400 (0:00:00.034) 0:10:38.290 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set expected cache size] *************************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37
Saturday 31 August 2024 18:46:10 -0400 (0:00:00.036) 0:10:38.327 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42
Saturday 31 August 2024 18:46:10 -0400 (0:00:00.034) 0:10:38.361 *******
skipping: [managed_node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25
Saturday 31 August 2024 18:46:10 -0400 (0:00:00.034) 0:10:38.396 *******
ok: [managed_node2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54
Saturday 31 August 2024 18:46:10 -0400 (0:00:00.034) 0:10:38.430 *******
ok: [managed_node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false }
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
managed_node2 : ok=1226 changed=60 unreachable=0 failed=9 skipped=1064 rescued=9 ignored=0

Saturday 31 August 2024 18:46:10 -0400 (0:00:00.017) 0:10:38.448 *******
===============================================================================
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 65.17s
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 35.28s
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 11.72s
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 11.22s
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 11.17s
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 11.13s
/tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69
fedora.linux_system_roles.storage : Manage
the pools and volumes to match the specified state -- 10.78s /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 10.63s /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 4.74s /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 4.74s /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 4.70s /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 4.55s /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 4.47s /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 fedora.linux_system_roles.storage : Get required packages --------------- 4.46s /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 4.45s /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 fedora.linux_system_roles.storage : Get required packages --------------- 4.45s /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 fedora.linux_system_roles.storage : Get required packages --------------- 4.43s /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 fedora.linux_system_roles.storage : Get required packages --------------- 4.40s /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 fedora.linux_system_roles.storage : Get required packages --------------- 4.35s /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 fedora.linux_system_roles.storage : Get required packages --------------- 4.33s /tmp/collections-fOI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
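For reference, a minimal sketch (assumed, not taken from this log) of the kind of storage role invocation that produces the teardown recorded above: the volume is requested absent, so blivet destroys the LUKS format, the mapped device, and the backing pool, then cleans up /etc/fstab and /etc/crypttab. The volume spec mirrors the _storage_volumes_list values printed earlier (name foo, type disk, on sda):

    # Sketch only: request removal of the test volume via the storage role.
    - hosts: managed_node2
      tasks:
        - name: Remove the test volume, its LUKS layer, and its backing pool
          include_role:
            name: fedora.linux_system_roles.storage
          vars:
            storage_volumes:
              - name: foo
                type: disk
                disks:
                  - sda
                state: absent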