ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, Nov 14 2023, 16:14:06) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
[WARNING]: running playbook inside collection fedora.linux_system_roles
statically imported: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.
PLAYBOOK: tests_luks.yml *******************************************************
1 plays in /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml

PLAY [Test LUKS] ***************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:2
Friday 17 January 2025 04:40:08 -0500 (0:00:00.032) 0:00:00.032 ********
ok: [managed-node1]
META: ran handlers

TASK [Enable FIPS mode] ********************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:20
Friday 17 January 2025 04:40:09 -0500 (0:00:01.107) 0:00:01.140 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Reboot] ******************************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:28
Friday 17 January 2025 04:40:09 -0500 (0:00:00.070) 0:00:01.210 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Enable FIPS mode] ********************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:39
Friday 17 January 2025 04:40:09 -0500 (0:00:00.075) 0:00:01.286 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Reboot] ******************************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:43
Friday 17 January 2025 04:40:09 -0500 (0:00:00.105) 0:00:01.391 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Ensure dracut-fips] ******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:53
Friday 17 January 2025 04:40:09 -0500 (0:00:00.090) 0:00:01.482 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Configure boot for FIPS] *************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:59
Friday 17 January 2025 04:40:09 -0500 (0:00:00.088) 0:00:01.571 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Reboot] ******************************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:68
Friday 17 January 2025 04:40:09 -0500 (0:00:00.074) 0:00:01.646 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Run the role] ************************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:72
Friday 17 January 2025 04:40:09 -0500 (0:00:00.079) 0:00:01.725 ********
TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Friday 17 January 2025 04:40:09 -0500 (0:00:00.137) 0:00:01.862 ********
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node1

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Friday 17 January 2025 04:40:09 -0500 (0:00:00.087) 0:00:01.950 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Friday 17 January 2025 04:40:10 -0500 (0:00:00.095) 0:00:02.045 ********
skipping: [managed-node1] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node1] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node1] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" }
skipping: [managed-node1] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" }
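For reference, the per-platform vars file selected above (roles/storage/vars/CentOS_7.yml) would look roughly like the sketch below. It is reconstructed from the ansible_facts printed in the task output, not copied from the repository; the Jinja expression picks the s390 build of libblockdev on s390x hosts.

```yaml
# Sketch of roles/storage/vars/CentOS_7.yml, reconstructed from the
# ansible_facts shown in the task output above (not the actual file).
__storage_blivet_diskvolume_mkfs_option_map:
  ext2: -F
  ext3: -F
  ext4: -F
blivet_package_list:
  - python-enum34
  - python-blivet3
  - libblockdev-crypto
  - libblockdev-dm
  - libblockdev-lvm
  - libblockdev-mdraid
  - libblockdev-swap
  # On s390x the architecture-specific libblockdev build is pulled in instead.
  - "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}"
```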
TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Friday 17 January 2025 04:40:10 -0500 (0:00:00.150) 0:00:02.196 ********
ok: [managed-node1] => { "changed": false, "stat": { "exists": false } }

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Friday 17 January 2025 04:40:10 -0500 (0:00:00.744) 0:00:02.940 ********
ok: [managed-node1] => { "ansible_facts": { "__storage_is_ostree": false }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Friday 17 January 2025 04:40:11 -0500 (0:00:00.080) 0:00:03.021 ********
ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Friday 17 January 2025 04:40:11 -0500 (0:00:00.040) 0:00:03.062 ********
ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Friday 17 January 2025 04:40:11 -0500 (0:00:00.039) 0:00:03.101 ********
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node1

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
Friday 17 January 2025 04:40:11 -0500 (0:00:00.121) 0:00:03.223 ********
ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] }

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Friday 17 January 2025 04:40:15 -0500 (0:00:04.066) 0:00:07.290 ********
ok: [managed-node1] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" }

TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14
Friday 17 January 2025 04:40:15 -0500 (0:00:00.062) 0:00:07.352 ********
ok: [managed-node1] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
Friday 17 January 2025 04:40:15 -0500 (0:00:00.057) 0:00:07.410 ********
ok: [managed-node1] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }
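The tasks above come from "Run the role" executing with neither storage_pools nor storage_volumes defined, so blivet computes an empty set of actions. A minimal sketch of such an invocation, assuming include_role is used the way the task name suggests (the real tests_luks.yml may differ):

```yaml
# Minimal sketch: invoking the storage role with no pools or volumes,
# which only verifies packages and gathers state (hypothetical form).
- name: Run the role
  include_role:
    name: fedora.linux_system_roles.storage
```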
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 January 2025 04:40:16 -0500 (0:00:00.032) 0:00:08.347 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 January 2025 04:40:16 -0500 (0:00:00.045) 0:00:08.393 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Friday 17 January 2025 04:40:17 -0500 (0:00:00.732) 0:00:09.125 ******** ok: [managed-node1] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": 
"systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, 
"gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": 
"stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", 
"state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": 
"systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": 
false }

TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] *****
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57
Friday 17 January 2025 04:40:18 -0500 (0:00:01.162) 0:00:10.288 ********
ok: [managed-node1] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63
Friday 17 January 2025 04:40:18 -0500 (0:00:00.048) 0:00:10.336 ********

TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69
Friday 17 January 2025 04:40:18 -0500 (0:00:00.031) 0:00:10.368 ********
ok: [managed-node1] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }

TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83
Friday 17 January 2025 04:40:18 -0500 (0:00:00.514) 0:00:10.883 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ******
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90
Friday 17 January 2025 04:40:18 -0500 (0:00:00.088) 0:00:10.971 ********
ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737106790.7644799, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "72884e3f126482c2d28276ff7c57744fa95eff91", "ctime": 1737106790.3414795, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263644, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1737106790.3414795, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1229, "uid": 0, "version": "18446744072031193646", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95
Friday 17 January 2025 04:40:19 -0500 (0:00:00.440) 0:00:11.412 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113
Friday 17 January 2025 04:40:19 -0500 (0:00:00.061) 0:00:11.473 ********

TASK [fedora.linux_system_roles.storage : Show blivet_output] ******************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119
Friday 17 January 2025 04:40:19 -0500 (0:00:00.064) 0:00:11.537 ********
[managed-node1] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Friday 17 January 2025 04:40:19 -0500 (0:00:00.082) 0:00:11.620 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Friday 17 January 2025 04:40:19 -0500 (0:00:00.138) 0:00:11.759 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Friday 17 January 2025 04:40:19 -0500 (0:00:00.054) 0:00:11.813 ******** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Friday 17 January 2025 04:40:19 -0500 (0:00:00.059) 0:00:11.872 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Friday 17 January 2025 04:40:19 -0500 (0:00:00.057) 0:00:11.929 ******** TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Friday 17 January 2025 04:40:19 -0500 (0:00:00.051) 0:00:11.980 ******** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Friday 17 January 2025 04:40:20 -0500 (0:00:00.052) 0:00:12.033 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Friday 17 January 2025 04:40:20 -0500 (0:00:00.064) 0:00:12.098 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737106724.7414114, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1737106722.514409, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263653, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1737106722.5134091, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": 
"18446744072031198583", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Friday 17 January 2025 04:40:20 -0500 (0:00:00.526) 0:00:12.624 ******** TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Friday 17 January 2025 04:40:20 -0500 (0:00:00.086) 0:00:12.710 ******** ok: [managed-node1] TASK [Get unused disks] ******************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:76 Friday 17 January 2025 04:40:21 -0500 (0:00:00.876) 0:00:13.587 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml for managed-node1 TASK [Ensure test packages] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:2 Friday 17 January 2025 04:40:21 -0500 (0:00:00.101) 0:00:13.689 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "util-linux-2.23.2-65.el7_9.1.x86_64 providing util-linux is already installed" ] } TASK [Find unused disks in the system] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:11 Friday 17 January 2025 04:40:22 -0500 (0:00:01.012) 0:00:14.701 ******** ok: [managed-node1] => { "changed": false, "disks": [ "sda" ], "info": [ "Line: NAME=\"/dev/sda\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdb\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdc\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdd\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sde\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdf\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdg\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdh\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdi\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/xvda\" TYPE=\"disk\" SIZE=\"268435456000\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/xvda1\" TYPE=\"part\" SIZE=\"268434390528\" FSTYPE=\"ext4\" LOG-SEC=\"512\"", "Line type [part] is not disk: NAME=\"/dev/xvda1\" TYPE=\"part\" SIZE=\"268434390528\" FSTYPE=\"ext4\" LOG-SEC=\"512\"", "filename [xvda1] is a partition", "Disk [/dev/xvda] attrs [{'fstype': '', 'type': 'disk', 'ssize': '512', 'size': '268435456000'}] has partitions" ] } TASK [Debug why there are no unused disks] ************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:20 Friday 17 January 2025 04:40:23 -0500 (0:00:00.666) 0:00:15.367 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set unused_disks if necessary] 
TASK [Debug why there are no unused disks] *************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:20
Friday 17 January 2025 04:40:23 -0500 (0:00:00.666) 0:00:15.367 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set unused_disks if necessary] *******************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:29
Friday 17 January 2025 04:40:23 -0500 (0:00:00.049) 0:00:15.417 ********
ok: [managed-node1] => { "ansible_facts": { "unused_disks": [ "sda" ] }, "changed": false }

TASK [Exit playbook when there's not enough unused disks in the system] ********
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:34
Friday 17 January 2025 04:40:23 -0500 (0:00:00.054) 0:00:15.472 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Print unused disks] ******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:39
Friday 17 January 2025 04:40:23 -0500 (0:00:00.048) 0:00:15.520 ********
ok: [managed-node1] => { "unused_disks": [ "sda" ] }

TASK [Test for correct handling of new encrypted volume w/ no key] *************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:85
Friday 17 January 2025 04:40:23 -0500 (0:00:00.043) 0:00:15.564 ********
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node1

TASK [Store global variable value copy] ****************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4
Friday 17 January 2025 04:40:23 -0500 (0:00:00.083) 0:00:15.647 ********
ok: [managed-node1] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false }
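The next block re-runs the role with an encrypted volume and no key, expecting it to fail; verify-role-failed.yml wraps the role so that a failure is the passing outcome. A hedged sketch of that pattern, with the volume spec taken from the storage_volumes value shown further below and the block/rescue structure assumed rather than copied from the test file:

```yaml
# Sketch of the failure-verification pattern driven by verify-role-failed.yml.
# The volume spec matches the storage_volumes printed in this log; the
# block/rescue wrapper is an assumption, not the actual test source.
- name: Verify role raises correct error
  block:
    - name: Run the role with an encrypted volume but no key
      include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_volumes:
          - name: foo
            type: disk
            disks: "{{ unused_disks }}"
            mount_point: /opt/test1
            encryption: true
  rescue:
    - name: Confirm the role failed as expected
      assert:
        that: ansible_failed_result is defined
```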
"__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node1] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 January 2025 04:40:24 -0500 (0:00:00.142) 0:00:16.107 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 January 2025 04:40:24 -0500 (0:00:00.054) 0:00:16.162 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Friday 17 January 2025 04:40:24 -0500 (0:00:00.053) 0:00:16.216 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 January 2025 04:40:24 -0500 (0:00:00.054) 0:00:16.271 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 January 2025 04:40:24 -0500 (0:00:00.058) 0:00:16.330 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Friday 17 January 2025 04:40:24 -0500 (0:00:00.126) 0:00:16.456 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", 
"libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 January 2025 04:40:28 -0500 (0:00:03.961) 0:00:20.417 ******** ok: [managed-node1] => { "storage_pools": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Friday 17 January 2025 04:40:28 -0500 (0:00:00.064) 0:00:20.481 ******** ok: [managed-node1] => { "storage_volumes": [ { "disks": [ "sda" ], "encryption": true, "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Friday 17 January 2025 04:40:28 -0500 (0:00:00.064) 0:00:20.545 ******** ok: [managed-node1] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Friday 17 January 2025 04:40:32 -0500 (0:00:03.922) 0:00:24.468 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 January 2025 04:40:32 -0500 (0:00:00.102) 0:00:24.570 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 January 2025 04:40:32 -0500 (0:00:00.054) 0:00:24.624 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 January 2025 04:40:32 -0500 (0:00:00.054) 0:00:24.679 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 January 2025 04:40:32 -0500 (0:00:00.051) 0:00:24.730 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Friday 17 January 2025 04:40:33 -0500 (0:00:00.745) 0:00:25.475 ******** ok: [managed-node1] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": 
"enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": 
{ "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": 
"kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": 
"plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" 
}, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": 
"systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Friday 17 January 2025 04:40:34 -0500 (0:00:01.020) 0:00:26.495 ******** ok: [managed-node1] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Friday 17 January 2025 04:40:34 -0500 (0:00:00.075) 0:00:26.571 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Friday 17 January 2025 04:40:34 -0500 (0:00:00.048) 0:00:26.620 ******** fatal: [managed-node1]: FAILED! 
=> { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'foo' missing key/password TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:109 Friday 17 January 2025 04:40:38 -0500 (0:00:03.928) 0:00:30.548 ******** fatal: [managed-node1]: FAILED! => { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'diskvolume_mkfs_option_map': {u'ext4': u'-F', u'ext3': u'-F', u'ext2': u'-F'}, u'safe_mode': True, u'pools': [], u'volumes': [{u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 10737418240, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'encryption': True, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'fs_overwrite_existing': True, u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'cache_size': 0, u'mount_user': None, u'raid_spare_count': None, u'cache_mode': None, u'name': u'foo', u'mount_group': None, u'type': u'disk', u'disks': [u'sda'], u'cached': False, u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'pool_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_password': None, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'shared': False, u'type': u'lvm', u'encryption_key_size': None, u'raid_chunk_size': None}, u'volume_defaults': {u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 0, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'cache_size': 0, u'raid_spare_count': None, u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"encrypted volume 'foo' missing key/password"} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Friday 17 January 2025 04:40:38 -0500 (0:00:00.130) 0:00:30.678 ******** TASK [Check that we failed in the role] 
TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Friday 17 January 2025 04:40:38 -0500 (0:00:00.053) 0:00:30.732 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Friday 17 January 2025 04:40:38 -0500 (0:00:00.071) 0:00:30.803 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Friday 17 January 2025 04:40:38 -0500 (0:00:00.097) 0:00:30.900 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Create an encrypted disk volume w/ default fs] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:100 Friday 17 January 2025 04:40:38 -0500 (0:00:00.050) 0:00:30.951 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 January 2025 04:40:39 -0500 (0:00:00.113) 0:00:31.064 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 January 2025 04:40:39 -0500 (0:00:00.076) 0:00:31.141 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 January 2025 04:40:39 -0500 (0:00:00.060) 0:00:31.202 ******** skipping: [managed-node1] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node1] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node1] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node1] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path:
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 January 2025 04:40:39 -0500 (0:00:00.122) 0:00:31.325 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 January 2025 04:40:39 -0500 (0:00:00.046) 0:00:31.371 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Friday 17 January 2025 04:40:39 -0500 (0:00:00.037) 0:00:31.409 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 January 2025 04:40:39 -0500 (0:00:00.036) 0:00:31.445 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 January 2025 04:40:39 -0500 (0:00:00.035) 0:00:31.480 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Friday 17 January 2025 04:40:39 -0500 (0:00:00.130) 0:00:31.611 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 January 2025 04:40:43 -0500 (0:00:04.010) 0:00:35.622 ******** ok: [managed-node1] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Friday 17 January 2025 04:40:43 -0500 (0:00:00.066) 0:00:35.689 ******** ok: [managed-node1] => { "storage_volumes": [ { "disks": [ "sda" ], "encryption": true, 
"encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Friday 17 January 2025 04:40:43 -0500 (0:00:00.075) 0:00:35.765 ******** ok: [managed-node1] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Friday 17 January 2025 04:40:47 -0500 (0:00:04.134) 0:00:39.899 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 January 2025 04:40:48 -0500 (0:00:00.162) 0:00:40.062 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 January 2025 04:40:48 -0500 (0:00:00.062) 0:00:40.125 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 January 2025 04:40:48 -0500 (0:00:00.057) 0:00:40.182 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 January 2025 04:40:48 -0500 (0:00:00.062) 0:00:40.244 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Friday 17 January 2025 04:40:49 -0500 (0:00:01.008) 0:00:41.253 ******** ok: [managed-node1] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": 
"systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" 
}, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", 
"state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { 
"name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": 
"systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Friday 17 January 2025 04:40:50 -0500 (0:00:01.260) 0:00:42.513 ******** ok: [managed-node1] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Friday 17 January 2025 04:40:50 -0500 (0:00:00.088) 0:00:42.602 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Friday 17 January 2025 04:40:50 -0500 (0:00:00.064) 0:00:42.667 ******** changed: [managed-node1] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-50ef5d15-ccb7-43d2-8b00-fd1d2599bcf8", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-50ef5d15-ccb7-43d2-8b00-fd1d2599bcf8", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-50ef5d15-ccb7-43d2-8b00-fd1d2599bcf8", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-50ef5d15-ccb7-43d2-8b00-fd1d2599bcf8" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-50ef5d15-ccb7-43d2-8b00-fd1d2599bcf8", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs" ], "pools": [], "volumes": [ { "_device": 
"/dev/mapper/luks-50ef5d15-ccb7-43d2-8b00-fd1d2599bcf8", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-50ef5d15-ccb7-43d2-8b00-fd1d2599bcf8", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Friday 17 January 2025 04:41:00 -0500 (0:00:10.124) 0:00:52.791 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Friday 17 January 2025 04:41:00 -0500 (0:00:00.055) 0:00:52.847 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737106790.7644799, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "72884e3f126482c2d28276ff7c57744fa95eff91", "ctime": 1737106790.3414795, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263644, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1737106790.3414795, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1229, "uid": 0, "version": "18446744072031193646", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Friday 17 January 2025 04:41:01 -0500 (0:00:00.414) 0:00:53.262 ******** ok: [managed-node1] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Friday 17 January 2025 04:41:01 -0500 (0:00:00.721) 0:00:53.984 ******** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Friday 17 January 2025 04:41:02 -0500 (0:00:00.049) 
TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Friday 17 January 2025 04:41:00 -0500 (0:00:10.124) 0:00:52.791 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Friday 17 January 2025 04:41:00 -0500 (0:00:00.055) 0:00:52.847 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737106790.7644799, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "72884e3f126482c2d28276ff7c57744fa95eff91", "ctime": 1737106790.3414795, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263644, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1737106790.3414795, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1229, "uid": 0, "version": "18446744072031193646", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Friday 17 January 2025 04:41:01 -0500 (0:00:00.414) 0:00:53.262 ******** ok: [managed-node1] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Friday 17 January 2025 04:41:01 -0500 (0:00:00.721) 0:00:53.984 ******** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Friday 17 January 2025 04:41:02 -0500 (0:00:00.049) 0:00:54.034 ******** ok: [managed-node1] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-50ef5d15-ccb7-43d2-8b00-fd1d2599bcf8", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-50ef5d15-ccb7-43d2-8b00-fd1d2599bcf8", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-50ef5d15-ccb7-43d2-8b00-fd1d2599bcf8", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-50ef5d15-ccb7-43d2-8b00-fd1d2599bcf8" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-50ef5d15-ccb7-43d2-8b00-fd1d2599bcf8", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-50ef5d15-ccb7-43d2-8b00-fd1d2599bcf8", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-50ef5d15-ccb7-43d2-8b00-fd1d2599bcf8", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Friday 17 January 2025 04:41:02 -0500 (0:00:00.068) 0:00:54.102 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Friday 17 January 2025 04:41:02 -0500 (0:00:00.063) 0:00:54.165 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-50ef5d15-ccb7-43d2-8b00-fd1d2599bcf8", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-50ef5d15-ccb7-43d2-8b00-fd1d2599bcf8", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "",
"fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Friday 17 January 2025 04:41:02 -0500 (0:00:00.086) 0:00:54.252 ******** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Friday 17 January 2025 04:41:02 -0500 (0:00:00.061) 0:00:54.313 ******** ok: [managed-node1] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Friday 17 January 2025 04:41:03 -0500 (0:00:01.106) 0:00:55.420 ******** changed: [managed-node1] => (item={u'src': u'/dev/mapper/luks-50ef5d15-ccb7-43d2-8b00-fd1d2599bcf8', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-50ef5d15-ccb7-43d2-8b00-fd1d2599bcf8", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-50ef5d15-ccb7-43d2-8b00-fd1d2599bcf8" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Friday 17 January 2025 04:41:04 -0500 (0:00:00.826) 0:00:56.246 ******** skipping: [managed-node1] => (item={u'src': u'/dev/mapper/luks-50ef5d15-ccb7-43d2-8b00-fd1d2599bcf8', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-50ef5d15-ccb7-43d2-8b00-fd1d2599bcf8", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Friday 17 January 2025 04:41:04 -0500 (0:00:00.071) 0:00:56.318 ******** ok: [managed-node1] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Friday 17 January 2025 04:41:04 -0500 (0:00:00.671) 0:00:56.989 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737106724.7414114, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1737106722.514409, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263653, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1737106722.5134091, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "18446744072031198583", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Friday 17 January 2025 04:41:05 -0500 (0:00:00.438) 0:00:57.428 ******** changed: [managed-node1] => (item={u'state': u'present', u'password': u'-', u'name': u'luks-50ef5d15-ccb7-43d2-8b00-fd1d2599bcf8', u'backing_device': u'/dev/sda'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-50ef5d15-ccb7-43d2-8b00-fd1d2599bcf8", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Friday 17 January 2025 04:41:05 -0500 (0:00:00.511) 0:00:57.940 ******** ok: [managed-node1] TASK [Verify role results] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:112 Friday 17 January 2025 04:41:06 -0500 (0:00:00.950) 0:00:58.890 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node1 TASK [Print out pool information] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Friday 17 January 2025 04:41:07 -0500 (0:00:00.166) 0:00:59.056 ******** skipping: [managed-node1] => {} TASK [Print out volume information] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Friday 17 January 2025 04:41:07 -0500 (0:00:00.054) 0:00:59.110 ******** ok: [managed-node1] => { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-50ef5d15-ccb7-43d2-8b00-fd1d2599bcf8", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-50ef5d15-ccb7-43d2-8b00-fd1d2599bcf8", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, 
"encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Friday 17 January 2025 04:41:07 -0500 (0:00:00.086) 0:00:59.197 ******** ok: [managed-node1] => { "changed": false, "info": { "/dev/mapper/luks-50ef5d15-ccb7-43d2-8b00-fd1d2599bcf8": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-50ef5d15-ccb7-43d2-8b00-fd1d2599bcf8", "size": "10G", "type": "crypt", "uuid": "03e7a520-90e4-4db4-9005-7253b079651b" }, "/dev/sda": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "50ef5d15-ccb7-43d2-8b00-fd1d2599bcf8" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Friday 17 January 2025 04:41:07 -0500 (0:00:00.757) 0:00:59.954 ******** ok: [managed-node1] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003004", "end": "2025-01-17 04:41:08.364996", "rc": 0, "start": "2025-01-17 04:41:08.361992" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Thu Jun 20 10:23:46 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1 
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-50ef5d15-ccb7-43d2-8b00-fd1d2599bcf8 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Friday 17 January 2025 04:41:08 -0500 (0:00:00.522) 0:01:00.477 ******** ok: [managed-node1] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002919", "end": "2025-01-17 04:41:08.733131", "failed_when_result": false, "rc": 0, "start": "2025-01-17 04:41:08.730212" } STDOUT: luks-50ef5d15-ccb7-43d2-8b00-fd1d2599bcf8 /dev/sda -
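Note: the crypttab line above uses the first three whitespace-separated fields of crypttab(5): the mapped device name (luks-<UUID of the LUKS container>), the backing device (/dev/sda), and the key file, where "-" means no key file is stored and a passphrase must be supplied to open the device. A sketch of checking those fields as an Ansible assert (hypothetical, not part of verify-role-results.yml):

  - name: Check the crypttab entry fields (illustrative sketch)
    vars:
      crypttab_entry: "luks-50ef5d15-ccb7-43d2-8b00-fd1d2599bcf8 /dev/sda -"
    assert:
      that:
        - crypttab_entry.split() | length == 3   # name, backing device, key file
        - crypttab_entry.split() | last == '-'   # '-' means no key file configured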
TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Friday 17 January 2025 04:41:08 -0500 (0:00:00.336) 0:01:00.814 ******** TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Friday 17 January 2025 04:41:08 -0500 (0:00:00.032) 0:01:00.847 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node1 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Friday 17 January 2025 04:41:08 -0500 (0:00:00.075) 0:01:00.922 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Friday 17 January 2025 04:41:08 -0500 (0:00:00.044) 0:01:00.967 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node1 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Friday 17 January 2025 04:41:09 -0500 (0:00:00.265) 0:01:01.232 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-50ef5d15-ccb7-43d2-8b00-fd1d2599bcf8" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Friday 17 January 2025 04:41:09 -0500 (0:00:00.068) 0:01:01.301 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Friday 17 January 2025 04:41:09 -0500 (0:00:00.064) 0:01:01.365 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Friday 17 January 2025 04:41:09 -0500 (0:00:00.046) 0:01:01.412 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Friday 17 January 2025 04:41:09 -0500 (0:00:00.043) 0:01:01.455 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Friday 17 January 2025 04:41:09 -0500 (0:00:00.036) 0:01:01.492 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Friday 17 January 2025 04:41:09 -0500 (0:00:00.039) 0:01:01.531 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Friday 17 January 2025 04:41:09 -0500 (0:00:00.039) 0:01:01.570 ******** skipping:
[managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Friday 17 January 2025 04:41:09 -0500 (0:00:00.046) 0:01:01.616 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Friday 17 January 2025 04:41:09 -0500 (0:00:00.056) 0:01:01.673 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Friday 17 January 2025 04:41:09 -0500 (0:00:00.056) 0:01:01.729 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Friday 17 January 2025 04:41:09 -0500 (0:00:00.058) 0:01:01.787 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-50ef5d15-ccb7-43d2-8b00-fd1d2599bcf8 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Friday 17 January 2025 04:41:09 -0500 (0:00:00.099) 0:01:01.887 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Friday 17 January 2025 04:41:09 -0500 (0:00:00.067) 0:01:01.955 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Friday 17 January 2025 04:41:10 -0500 (0:00:00.067) 0:01:02.022 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Friday 17 January 2025 04:41:10 -0500 (0:00:00.056) 0:01:02.079 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Clean up 
variables] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Friday 17 January 2025 04:41:10 -0500 (0:00:00.073) 0:01:02.152 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Friday 17 January 2025 04:41:10 -0500 (0:00:00.060) 0:01:02.212 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Friday 17 January 2025 04:41:10 -0500 (0:00:00.080) 0:01:02.292 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Friday 17 January 2025 04:41:10 -0500 (0:00:00.079) 0:01:02.372 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737106860.5175526, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1737106860.5175526, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 28267, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1737106860.5175526, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Friday 17 January 2025 04:41:10 -0500 (0:00:00.408) 0:01:02.781 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Friday 17 January 2025 04:41:10 -0500 (0:00:00.065) 0:01:02.846 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Friday 17 January 2025 04:41:10 -0500 (0:00:00.055) 0:01:02.901 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** 
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Friday 17 January 2025 04:41:10 -0500 (0:00:00.068) 0:01:02.970 ******** ok: [managed-node1] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Friday 17 January 2025 04:41:11 -0500 (0:00:00.061) 0:01:03.031 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Friday 17 January 2025 04:41:11 -0500 (0:00:00.059) 0:01:03.090 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Friday 17 January 2025 04:41:11 -0500 (0:00:00.052) 0:01:03.142 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737106860.6295528, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1737106860.6295528, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 348987, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1737106860.6295528, "nlink": 1, "path": "/dev/mapper/luks-50ef5d15-ccb7-43d2-8b00-fd1d2599bcf8", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Friday 17 January 2025 04:41:11 -0500 (0:00:00.345) 0:01:03.488 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Friday 17 January 2025 04:41:12 -0500 (0:00:00.622) 0:01:04.110 ******** ok: [managed-node1] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda" ], "delta": "0:00:00.026499", "end": "2025-01-17 04:41:12.405850", "rc": 0, "start": "2025-01-17 04:41:12.379351" } STDOUT: LUKS header information for /dev/sda Version: 1 Cipher name: aes Cipher mode: xts-plain64 Hash spec: sha256 Payload offset: 8192 MK bits: 512 MK digest: b5 d1 c8 15 eb 1e ff c0 1d a0 0d 2c fd c8 65 b2 69 3b 00 f8 MK salt: 7a 68 b9 9d b0 25 86 b5 ad 95 8e ad fe ce b1 e9 62 e8 ed 76 36 f3 a8 bc 7e b2 b9 83 29 1c 31 36 MK iterations: 23076 UUID: 50ef5d15-ccb7-43d2-8b00-fd1d2599bcf8 Key Slot 0: ENABLED Iterations: 368178 Salt: 0e 05 4f 03 df 88 c2 99 ec 9c e5 50 30 b8 64 25 a0 3a d5 f2 05 b2 47 50 41 0f c5 70 8c 0f 66 81 Key material offset: 8 AF stripes: 4000 Key Slot 1: DISABLED Key Slot 2: DISABLED Key Slot 3: DISABLED Key Slot 4: DISABLED Key Slot 5: DISABLED Key Slot 6: DISABLED Key Slot 7: DISABLED
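Note: the luksDump output above shows a LUKS1 header (Version: 1, aes in xts-plain64 mode, sha256 hash spec) with only key slot 0 enabled, consistent with encryption_luks_version being left unset on this CentOS 7 node, where cryptsetup 2.0.3 still defaults to LUKS1. A sketch of asserting those header fields in a follow-up task (hypothetical, not part of test-verify-volume-encryption.yml; luks_dump is a made-up register name):

  - name: Dump the LUKS header (illustrative sketch)
    command: cryptsetup luksDump /dev/sda
    register: luks_dump            # captures the header text shown above
    changed_when: false
  - name: Assert the header reports LUKS1 with aes-xts-plain64
    assert:
      that:
        - luks_dump.stdout is search('Version:\s+1')
        - luks_dump.stdout is search('Cipher name:\s+aes')
        - luks_dump.stdout is search('Cipher mode:\s+xts-plain64')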
TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Friday 17 January 2025 04:41:12 -0500 (0:00:00.383) 0:01:04.493 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Friday 17 January 2025 04:41:12 -0500 (0:00:00.051) 0:01:04.545 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Friday 17 January 2025 04:41:12 -0500 (0:00:00.068) 0:01:04.614 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Friday 17 January 2025 04:41:12 -0500 (0:00:00.069) 0:01:04.684 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Friday 17 January 2025 04:41:12 -0500 (0:00:00.067) 0:01:04.752 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Friday 17 January 2025 04:41:12 -0500 (0:00:00.060) 0:01:04.812 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Friday 17 January 2025 04:41:12 -0500 (0:00:00.060) 0:01:04.872 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Friday 17 January 2025 04:41:12 -0500 (0:00:00.048) 0:01:04.921 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-50ef5d15-ccb7-43d2-8b00-fd1d2599bcf8 /dev/sda -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Friday 17 January 2025 04:41:12
-0500 (0:00:00.059) 0:01:04.981 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Friday 17 January 2025 04:41:13 -0500 (0:00:00.052) 0:01:05.033 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Friday 17 January 2025 04:41:13 -0500 (0:00:00.050) 0:01:05.084 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Friday 17 January 2025 04:41:13 -0500 (0:00:00.052) 0:01:05.136 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Friday 17 January 2025 04:41:13 -0500 (0:00:00.051) 0:01:05.187 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Friday 17 January 2025 04:41:13 -0500 (0:00:00.051) 0:01:05.239 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Friday 17 January 2025 04:41:13 -0500 (0:00:00.046) 0:01:05.286 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Friday 17 January 2025 04:41:13 -0500 (0:00:00.045) 0:01:05.332 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Friday 17 January 2025 04:41:13 -0500 (0:00:00.046) 0:01:05.378 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Friday 17 January 2025 04:41:13 -0500 (0:00:00.043) 0:01:05.422 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] 
**************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Friday 17 January 2025 04:41:13 -0500 (0:00:00.036) 0:01:05.458 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Friday 17 January 2025 04:41:13 -0500 (0:00:00.053) 0:01:05.511 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Friday 17 January 2025 04:41:13 -0500 (0:00:00.046) 0:01:05.558 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Friday 17 January 2025 04:41:13 -0500 (0:00:00.055) 0:01:05.613 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Friday 17 January 2025 04:41:13 -0500 (0:00:00.047) 0:01:05.661 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Friday 17 January 2025 04:41:13 -0500 (0:00:00.048) 0:01:05.710 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Friday 17 January 2025 04:41:13 -0500 (0:00:00.051) 0:01:05.761 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Friday 17 January 2025 04:41:13 -0500 (0:00:00.047) 0:01:05.809 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Friday 17 January 2025 04:41:13 -0500 (0:00:00.040) 0:01:05.849 ******** ok: [managed-node1] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Friday 17 January 2025 04:41:13 -0500 (0:00:00.038) 0:01:05.888 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Friday 17 January 2025 04:41:13 -0500 (0:00:00.039) 0:01:05.928 ******** skipping: [managed-node1] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Friday 17 January 2025 04:41:13 -0500 (0:00:00.040) 0:01:05.969 ******** skipping: [managed-node1] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Friday 17 January 2025 04:41:13 -0500 (0:00:00.039) 0:01:06.008 ******** skipping: [managed-node1] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Friday 17 January 2025 04:41:14 -0500 (0:00:00.044) 0:01:06.053 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Friday 17 January 2025 04:41:14 -0500 (0:00:00.054) 0:01:06.108 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Friday 17 January 2025 04:41:14 -0500 (0:00:00.055) 0:01:06.163 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Friday 17 January 2025 04:41:14 -0500 (0:00:00.057) 0:01:06.220 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Friday 17 January 2025 04:41:14 -0500 (0:00:00.054) 0:01:06.275 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Friday 17 January 2025 04:41:14 -0500 (0:00:00.047) 0:01:06.323 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Friday 17 January 2025 04:41:14 -0500 (0:00:00.044) 0:01:06.367 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Friday 17 January 2025 04:41:14 -0500 (0:00:00.035) 0:01:06.403 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Friday 17 January 2025 04:41:14 -0500 (0:00:00.034) 0:01:06.438 ******** skipping: [managed-node1] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Friday 17 January 2025 04:41:14 -0500 (0:00:00.042) 0:01:06.480 ******** skipping: [managed-node1] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Friday 17 January 2025 04:41:14 -0500 (0:00:00.076) 0:01:06.557 ******** skipping: [managed-node1] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Friday 17 January 2025 04:41:14 -0500 (0:00:00.057) 0:01:06.615 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Friday 17 January 2025 04:41:14 -0500 (0:00:00.054) 0:01:06.669 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Friday 17 January 2025 04:41:14 -0500 (0:00:00.043) 0:01:06.713 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Friday 17 January 2025 04:41:14 -0500 (0:00:00.046) 0:01:06.759 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Friday 17 January 2025 04:41:14 -0500 (0:00:00.046) 0:01:06.806 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Friday 17 January 2025 04:41:14 -0500 (0:00:00.045) 0:01:06.851 ******** ok: [managed-node1] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Friday 17 January 2025 04:41:14 -0500 (0:00:00.040) 0:01:06.892 ******** ok: [managed-node1] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Friday 17 January 2025 04:41:14 -0500 (0:00:00.039) 0:01:06.932 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Friday 17 January 2025 04:41:14 -0500 (0:00:00.039) 0:01:06.972 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Friday 17 January 2025 04:41:14 -0500 (0:00:00.037) 0:01:07.009 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Friday 17 January 2025 04:41:15 -0500 (0:00:00.036) 0:01:07.045 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Friday 17 January 2025 04:41:15 -0500 (0:00:00.053) 0:01:07.098 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Friday 17 January 2025 04:41:15 -0500 (0:00:00.055) 0:01:07.154 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Friday 17 January 2025 04:41:15 -0500 (0:00:00.048) 0:01:07.203 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Friday 17 January 2025 04:41:15 -0500 (0:00:00.044) 0:01:07.247 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Friday 17 January 2025 04:41:15 -0500 (0:00:00.046) 0:01:07.293 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Friday 17 January 2025 04:41:15 -0500 (0:00:00.041) 0:01:07.335 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Friday 17 January 2025 04:41:15 -0500 (0:00:00.039) 0:01:07.374 ******** changed: [managed-node1] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:118 Friday 17 January 2025 04:41:15 -0500 (0:00:00.581) 0:01:07.955 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node1 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Friday 17 January 2025 04:41:16 -0500 (0:00:00.075) 0:01:08.030 ******** ok: [managed-node1] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Friday 17 January 2025 04:41:16 -0500 (0:00:00.043) 0:01:08.074 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 January 2025 04:41:16 -0500 (0:00:00.115) 0:01:08.189 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 January 2025 04:41:16 -0500 (0:00:00.081) 0:01:08.270 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 January 2025 04:41:16 -0500 (0:00:00.071) 0:01:08.342 ******** skipping: [managed-node1] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node1] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node1] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node1] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 January 2025 04:41:16 -0500 (0:00:00.132) 0:01:08.475 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 January 2025 04:41:16 -0500 (0:00:00.057) 0:01:08.533 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Friday 17 January 2025 04:41:16 -0500 (0:00:00.059) 0:01:08.593 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 January 2025 04:41:16 -0500 (0:00:00.049) 0:01:08.642 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 January 2025 04:41:16 -0500 (0:00:00.072) 0:01:08.714 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Friday 17 January 2025 04:41:16 -0500 (0:00:00.133) 0:01:08.848 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch 
providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 January 2025 04:41:20 -0500 (0:00:03.958) 0:01:12.806 ******** ok: [managed-node1] => { "storage_pools": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Friday 17 January 2025 04:41:20 -0500 (0:00:00.070) 0:01:12.877 ******** ok: [managed-node1] => { "storage_volumes": [ { "disks": [ "sda" ], "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Friday 17 January 2025 04:41:20 -0500 (0:00:00.067) 0:01:12.944 ******** ok: [managed-node1] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Friday 17 January 2025 04:41:24 -0500 (0:00:03.892) 0:01:16.837 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 January 2025 04:41:24 -0500 (0:00:00.180) 0:01:17.018 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 January 2025 04:41:25 -0500 (0:00:00.090) 0:01:17.108 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 January 2025 04:41:25 -0500 (0:00:00.104) 0:01:17.212 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 January 2025 04:41:25 -0500 (0:00:00.125) 0:01:17.337 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ 
"kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Friday 17 January 2025 04:41:26 -0500 (0:00:01.022) 0:01:18.360 ******** ok: [managed-node1] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": 
"rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": 
{ "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": 
"systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Friday 17 January 2025 04:41:27 -0500 (0:00:01.140) 0:01:19.500 ******** ok: [managed-node1] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Friday 17 January 2025 04:41:27 -0500 (0:00:00.082) 0:01:19.583 ******** TASK [fedora.linux_system_roles.storage : 
Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Friday 17 January 2025 04:41:27 -0500 (0:00:00.050) 0:01:19.633 ******** fatal: [managed-node1]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-50ef5d15-ccb7-43d2-8b00-fd1d2599bcf8' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:109 Friday 17 January 2025 04:41:31 -0500 (0:00:04.324) 0:01:23.958 ******** fatal: [managed-node1]: FAILED! => { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'diskvolume_mkfs_option_map': {u'ext4': u'-F', u'ext3': u'-F', u'ext2': u'-F'}, u'safe_mode': True, u'pools': [], u'volumes': [{u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 10733223936, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'fs_overwrite_existing': True, u'encryption_key_size': 0, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': u'luks1', u'raid_stripe_size': None, u'cache_size': 0, u'mount_user': None, u'raid_spare_count': None, u'cache_mode': None, u'name': u'foo', u'mount_group': None, u'type': u'disk', u'disks': [u'sda'], u'cached': False, u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'pool_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_password': None, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'shared': False, u'type': u'lvm', u'encryption_key_size': None, u'raid_chunk_size': None}, u'volume_defaults': {u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 0, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'cache_size': 0, u'raid_spare_count': None, u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}, u'use_partitions': None}}, u'mounts': [], u'packages': [], 
u'msg': u"cannot remove existing formatting on device 'luks-50ef5d15-ccb7-43d2-8b00-fd1d2599bcf8' in safe mode due to encryption removal"} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Friday 17 January 2025 04:41:32 -0500 (0:00:00.087) 0:01:24.046 ******** TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Friday 17 January 2025 04:41:32 -0500 (0:00:00.071) 0:01:24.118 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Friday 17 January 2025 04:41:32 -0500 (0:00:00.076) 0:01:24.194 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Friday 17 January 2025 04:41:32 -0500 (0:00:00.096) 0:01:24.291 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Friday 17 January 2025 04:41:32 -0500 (0:00:00.071) 0:01:24.362 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737106875.8695776, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1737106875.8695776, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 67, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1737106875.8695776, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "18446744073666735769", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Friday 17 January 2025 04:41:32 -0500 (0:00:00.553) 0:01:24.916 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:138 Friday 17 January 2025 04:41:32 -0500 (0:00:00.059) 0:01:24.975 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 January 2025 04:41:33 -0500 (0:00:00.267) 0:01:25.243 ******** included: 
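The fatal result above is the outcome this test case is designed to provoke: with storage_safe_mode: true (captured earlier as storage_safe_mode_global), the blivet provider refuses to strip the existing LUKS formatting, since honoring encryption: false would destroy the data on the device, and the follow-up assertions confirm both that the role failed with the right message and that /opt/test1/quux survived untouched. The "Remove the encryption layer" play starting here re-invokes the role, presumably with safe mode relaxed so the same reformat is now permitted. A minimal sketch of such an invocation, using the volume values shown in this log's storage_volumes output; the play scaffolding is assumed, not taken from tests_luks.yml:

    - name: Remove the encryption layer
      include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_safe_mode: false        # allow destroying existing formatting
        storage_volumes:
          - name: foo
            type: disk
            disks: [sda]
            mount_point: /opt/test1
            encryption: false           # request removal of the LUKS layer
            encryption_password: yabbadabbadoo
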
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 January 2025 04:41:33 -0500 (0:00:00.109) 0:01:25.352 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 January 2025 04:41:33 -0500 (0:00:00.090) 0:01:25.442 ******** skipping: [managed-node1] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node1] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node1] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node1] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 January 2025 04:41:33 -0500 (0:00:00.180) 0:01:25.623 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 January 2025 04:41:33 -0500 (0:00:00.055) 0:01:25.678 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Friday 17 January 2025 04:41:33 -0500 (0:00:00.067) 0:01:25.745 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 January 2025 04:41:33 -0500 (0:00:00.062) 0:01:25.808 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 January 2025 04:41:33 -0500 (0:00:00.066) 0:01:25.874 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Friday 17 January 2025 04:41:34 -0500 (0:00:00.148) 0:01:26.023 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 January 2025 04:41:35 -0500 (0:00:01.984) 0:01:28.007 ******** ok: [managed-node1] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Friday 17 January 2025 04:41:36 -0500 (0:00:00.064) 0:01:28.072 ******** ok: [managed-node1] => { "storage_volumes": [ { "disks": [ "sda" ], "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Friday 17 January 2025 04:41:36 -0500 (0:00:00.070) 0:01:28.143 ******** ok: [managed-node1] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Friday 17 January 2025 04:41:39 -0500 (0:00:03.843) 0:01:31.987 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 January 2025 04:41:40 -0500 (0:00:00.102) 0:01:32.089 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 January 2025 04:41:40 -0500 (0:00:00.068) 0:01:32.157 ******** 
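Note that "Get required packages" in both role runs returns empty actions and packages: it is a planning pass in which the role calls its bundled blivet module without making changes, only to learn which packages the requested layout would need. Judging from the module_args recorded in the earlier failure dump, a packages-only invocation would look roughly like the sketch below; packages_only: true is an assumption about this step, while the other parameter names appear verbatim in that dump:

    - name: Get required packages
      blivet:
        packages_only: true             # assumed: report packages, change nothing
        pools: "{{ _storage_pools_list }}"
        volumes: "{{ _storage_volumes_list }}"
        safe_mode: "{{ storage_safe_mode_global }}"   # illustrative wiring only
      register: package_info            # hypothetical register name
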
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 January 2025 04:41:40 -0500 (0:00:00.068) 0:01:32.225 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 January 2025 04:41:40 -0500 (0:00:00.056) 0:01:32.282 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Friday 17 January 2025 04:41:41 -0500 (0:00:00.862) 0:01:33.144 ******** ok: [managed-node1] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": 
"systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, 
"gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": 
"stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", 
"state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": 
"systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": 
false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Friday 17 January 2025 04:41:42 -0500 (0:00:01.140) 0:01:34.284 ******** ok: [managed-node1] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Friday 17 January 2025 04:41:42 -0500 (0:00:00.118) 0:01:34.403 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Friday 17 January 2025 04:41:42 -0500 (0:00:00.054) 0:01:34.457 ******** changed: [managed-node1] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-50ef5d15-ccb7-43d2-8b00-fd1d2599bcf8", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-50ef5d15-ccb7-43d2-8b00-fd1d2599bcf8", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-50ef5d15-ccb7-43d2-8b00-fd1d2599bcf8", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-50ef5d15-ccb7-43d2-8b00-fd1d2599bcf8", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=b54980d6-c1a2-4bb9-8319-0097beaa5efa", "state": "mounted" } ], "packages": [ "xfsprogs", "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=b54980d6-c1a2-4bb9-8319-0097beaa5efa", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10733223936, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Friday 17 January 2025 04:41:47 -0500 (0:00:04.627) 0:01:39.084 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } 
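The "Manage the pools and volumes to match the specified state" result above is the core of this step: blivet destroys the xfs filesystem inside the LUKS container, closes and removes /dev/mapper/luks-50ef5d15-ccb7-43d2-8b00-fd1d2599bcf8, wipes the LUKS format from /dev/sda, and creates a plain xfs filesystem directly on the raw disk. A minimal sketch of a role invocation that requests this end state follows; the volume keys mirror the "volumes" entry in the output above, while the play header and the storage_safe_mode line are assumptions, since the calling task in tests_luks.yml is not shown in this excerpt:

- name: Remove LUKS from the test volume (sketch, not the literal test task)
  hosts: managed-node1
  vars:
    # Assumption: stripping an existing LUKS layer destroys data, so the
    # role's safe mode (default true) presumably has to be disabled here.
    storage_safe_mode: false
  roles:
    - role: fedora.linux_system_roles.storage
      vars:
        storage_volumes:
          - name: foo
            type: disk
            disks:
              - sda
            fs_type: xfs
            mount_point: /opt/test1
            # Previously true; flipping this to false is what produces the
            # destroy format/device actions listed in the output above.
            encryption: false

Setting encryption: true again, together with an encryption_password, would request the inverse and re-create the LUKS container on top of the disk.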
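Once blivet returns its action list, the role reconciles the bookkeeping files, as the "Remove obsolete mounts", "Set up new/current mounts", and "Manage /etc/crypttab" tasks below show: the stale /dev/mapper source is removed from /etc/fstab, the new UUID-based entry is mounted, and the matching crypttab line is deleted. Expressed as standalone tasks with stock Ansible 2.9 modules it looks roughly like the sketch below; the values are copied from the loop items in the tasks that follow, the crypttab regexp is a guess at the entry layout, and the role itself iterates over blivet's "mounts" and "crypts" lists rather than hard-coding anything:

- hosts: managed-node1
  become: true
  tasks:
    - name: Drop the stale LUKS-backed entry from /etc/fstab and unmount it
      mount:
        path: /opt/test1
        src: /dev/mapper/luks-50ef5d15-ccb7-43d2-8b00-fd1d2599bcf8
        fstype: xfs
        state: absent

    - name: Mount the re-created xfs filesystem by its new UUID
      mount:
        path: /opt/test1
        src: UUID=b54980d6-c1a2-4bb9-8319-0097beaa5efa
        fstype: xfs
        opts: defaults
        state: mounted

    - name: Remove the stale crypttab entry (the task below reports "1 line(s) removed")
      lineinfile:
        path: /etc/crypttab
        # Assumed entry layout: "<name> <backing_device> <password>"
        regexp: '^luks-50ef5d15-ccb7-43d2-8b00-fd1d2599bcf8 '
        state: absent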
TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Friday 17 January 2025 04:41:47 -0500 (0:00:00.053) 0:01:39.138 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737106864.1195564, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "8b4c0b185bb4d56fce10bf34f2ff7c7dd9fe8355", "ctime": 1737106864.1175566, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263644, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1737106864.1175566, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1311, "uid": 0, "version": "18446744072031193646", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Friday 17 January 2025 04:41:47 -0500 (0:00:00.418) 0:01:39.557 ******** ok: [managed-node1] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Friday 17 January 2025 04:41:47 -0500 (0:00:00.428) 0:01:39.985 ******** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Friday 17 January 2025 04:41:48 -0500 (0:00:00.050) 0:01:40.035 ******** ok: [managed-node1] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-50ef5d15-ccb7-43d2-8b00-fd1d2599bcf8", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-50ef5d15-ccb7-43d2-8b00-fd1d2599bcf8", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-50ef5d15-ccb7-43d2-8b00-fd1d2599bcf8", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-50ef5d15-ccb7-43d2-8b00-fd1d2599bcf8", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=b54980d6-c1a2-4bb9-8319-0097beaa5efa", "state": "mounted" } ], "packages": [ "xfsprogs", "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=b54980d6-c1a2-4bb9-8319-0097beaa5efa", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, 
"encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10733223936, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Friday 17 January 2025 04:41:48 -0500 (0:00:00.107) 0:01:40.143 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Friday 17 January 2025 04:41:48 -0500 (0:00:00.064) 0:01:40.207 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=b54980d6-c1a2-4bb9-8319-0097beaa5efa", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10733223936, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Friday 17 January 2025 04:41:48 -0500 (0:00:00.097) 0:01:40.306 ******** changed: [managed-node1] => (item={u'src': u'/dev/mapper/luks-50ef5d15-ccb7-43d2-8b00-fd1d2599bcf8', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-50ef5d15-ccb7-43d2-8b00-fd1d2599bcf8", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-50ef5d15-ccb7-43d2-8b00-fd1d2599bcf8" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Friday 17 January 2025 04:41:48 -0500 (0:00:00.501) 0:01:40.807 ******** ok: [managed-node1] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Friday 17 January 2025 04:41:49 -0500 (0:00:00.595) 0:01:41.403 ******** changed: [managed-node1] => (item={u'src': u'UUID=b54980d6-c1a2-4bb9-8319-0097beaa5efa', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=b54980d6-c1a2-4bb9-8319-0097beaa5efa", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=b54980d6-c1a2-4bb9-8319-0097beaa5efa" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Friday 17 January 2025 04:41:49 -0500 (0:00:00.541) 0:01:41.944 ******** skipping: [managed-node1] => (item={u'src': u'UUID=b54980d6-c1a2-4bb9-8319-0097beaa5efa', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=b54980d6-c1a2-4bb9-8319-0097beaa5efa", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Friday 17 January 2025 04:41:50 -0500 (0:00:00.137) 0:01:42.082 ******** ok: [managed-node1] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Friday 17 January 2025 04:41:50 -0500 (0:00:00.607) 0:01:42.689 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737106868.7325647, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "5d04515aa82b76197f4189fa7bfe08219e0174bb", "ctime": 1737106865.8205595, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263661, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1737106865.8195593, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 53, "uid": 0, "version": "18446744072031199073", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, 
"xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Friday 17 January 2025 04:41:51 -0500 (0:00:00.455) 0:01:43.144 ******** changed: [managed-node1] => (item={u'state': u'absent', u'password': u'-', u'name': u'luks-50ef5d15-ccb7-43d2-8b00-fd1d2599bcf8', u'backing_device': u'/dev/sda'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-50ef5d15-ccb7-43d2-8b00-fd1d2599bcf8", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Friday 17 January 2025 04:41:51 -0500 (0:00:00.526) 0:01:43.671 ******** ok: [managed-node1] TASK [Verify role results] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:151 Friday 17 January 2025 04:41:52 -0500 (0:00:00.835) 0:01:44.507 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node1 TASK [Print out pool information] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Friday 17 January 2025 04:41:52 -0500 (0:00:00.144) 0:01:44.652 ******** skipping: [managed-node1] => {} TASK [Print out volume information] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Friday 17 January 2025 04:41:52 -0500 (0:00:00.063) 0:01:44.715 ******** ok: [managed-node1] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=b54980d6-c1a2-4bb9-8319-0097beaa5efa", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10733223936, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Friday 17 January 2025 04:41:52 -0500 (0:00:00.100) 0:01:44.816 ******** ok: [managed-node1] => { "changed": false, "info": { "/dev/sda": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "b54980d6-c1a2-4bb9-8319-0097beaa5efa" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Friday 17 January 2025 04:41:53 -0500 (0:00:00.442) 0:01:45.259 ******** ok: [managed-node1] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003097", "end": "2025-01-17 04:41:53.641678", "rc": 0, "start": "2025-01-17 04:41:53.638581" } STDOUT:
# system_role:storage
#
# /etc/fstab
# Created by anaconda on Thu Jun 20 10:23:46 2024
#
# Accessible filesystems, by reference, are maintained under '/dev/disk'
# See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info
#
UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
UUID=b54980d6-c1a2-4bb9-8319-0097beaa5efa /opt/test1 xfs defaults 0 0
TASK [Read the /etc/crypttab file] 
********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Friday 17 January 2025 04:41:53 -0500 (0:00:00.487) 0:01:45.747 ******** ok: [managed-node1] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002888", "end": "2025-01-17 04:41:54.092925", "failed_when_result": false, "rc": 0, "start": "2025-01-17 04:41:54.090037" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Friday 17 January 2025 04:41:54 -0500 (0:00:00.456) 0:01:46.203 ******** TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Friday 17 January 2025 04:41:54 -0500 (0:00:00.053) 0:01:46.257 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node1 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Friday 17 January 2025 04:41:54 -0500 (0:00:00.155) 0:01:46.413 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Friday 17 January 2025 04:41:54 -0500 (0:00:00.069) 0:01:46.482 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node1 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Friday 17 January 2025 04:41:54 -0500 (0:00:00.260) 0:01:46.743 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false } TASK [Set some facts] ********************************************************** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Friday 17 January 2025 04:41:54 -0500 (0:00:00.061) 0:01:46.804 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Friday 17 January 2025 04:41:54 -0500 (0:00:00.070) 0:01:46.875 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Friday 17 January 2025 04:41:54 -0500 (0:00:00.059) 0:01:46.934 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Friday 17 January 2025 04:41:54 -0500 (0:00:00.062) 0:01:46.997 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Friday 17 January 2025 04:41:55 -0500 (0:00:00.044) 0:01:47.041 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Friday 17 January 2025 04:41:55 -0500 (0:00:00.045) 0:01:47.087 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Friday 17 January 2025 04:41:55 -0500 (0:00:00.044) 0:01:47.132 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Friday 17 January 2025 04:41:55 -0500 (0:00:00.042) 0:01:47.174 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Friday 17 January 2025 04:41:55 -0500 (0:00:00.037) 0:01:47.212 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Friday 17 January 2025 
04:41:55 -0500 (0:00:00.037) 0:01:47.249 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Friday 17 January 2025 04:41:55 -0500 (0:00:00.043) 0:01:47.294 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=b54980d6-c1a2-4bb9-8319-0097beaa5efa " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Friday 17 January 2025 04:41:55 -0500 (0:00:00.087) 0:01:47.381 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Friday 17 January 2025 04:41:55 -0500 (0:00:00.057) 0:01:47.439 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Friday 17 January 2025 04:41:55 -0500 (0:00:00.060) 0:01:47.499 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Friday 17 January 2025 04:41:55 -0500 (0:00:00.052) 0:01:47.552 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Friday 17 January 2025 04:41:55 -0500 (0:00:00.043) 0:01:47.596 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Friday 17 January 2025 04:41:55 -0500 (0:00:00.037) 0:01:47.633 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Friday 17 January 2025 04:41:55 -0500 (0:00:00.059) 0:01:47.692 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Friday 17 January 2025 04:41:55 -0500 (0:00:00.076) 0:01:47.769 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737106906.920634, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1737106906.920634, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 28267, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1737106906.920634, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Friday 17 January 2025 04:41:56 -0500 (0:00:00.396) 0:01:48.166 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Friday 17 January 2025 04:41:56 -0500 (0:00:00.071) 0:01:48.237 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Friday 17 January 2025 04:41:56 -0500 (0:00:00.061) 0:01:48.299 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Friday 17 January 2025 04:41:56 -0500 (0:00:00.060) 0:01:48.360 ******** ok: [managed-node1] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Friday 17 January 2025 04:41:56 -0500 (0:00:00.046) 0:01:48.406 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Friday 17 January 2025 04:41:56 -0500 (0:00:00.049) 0:01:48.456 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] 
************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Friday 17 January 2025 04:41:56 -0500 (0:00:00.049) 0:01:48.506 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Friday 17 January 2025 04:41:56 -0500 (0:00:00.044) 0:01:48.550 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Friday 17 January 2025 04:41:57 -0500 (0:00:00.628) 0:01:49.179 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Friday 17 January 2025 04:41:57 -0500 (0:00:00.036) 0:01:49.215 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Friday 17 January 2025 04:41:57 -0500 (0:00:00.037) 0:01:49.253 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Friday 17 January 2025 04:41:57 -0500 (0:00:00.051) 0:01:49.305 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Friday 17 January 2025 04:41:57 -0500 (0:00:00.036) 0:01:49.341 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Friday 17 January 2025 04:41:57 -0500 (0:00:00.036) 0:01:49.378 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Friday 17 January 2025 04:41:57 -0500 (0:00:00.036) 0:01:49.414 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Friday 17 January 2025 04:41:57 -0500 (0:00:00.036) 0:01:49.451 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Friday 17 January 2025 04:41:57 -0500 (0:00:00.038) 0:01:49.489 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Friday 17 January 2025 04:41:57 -0500 (0:00:00.045) 0:01:49.535 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Friday 17 January 2025 04:41:57 -0500 (0:00:00.049) 0:01:49.585 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Friday 17 January 2025 04:41:57 -0500 (0:00:00.044) 0:01:49.630 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Friday 17 January 2025 04:41:57 -0500 (0:00:00.036) 0:01:49.666 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Friday 17 January 2025 04:41:57 -0500 (0:00:00.036) 0:01:49.702 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Friday 17 January 2025 04:41:57 -0500 (0:00:00.039) 0:01:49.742 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Friday 17 January 2025 04:41:57 -0500 (0:00:00.036) 0:01:49.778 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] 
************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Friday 17 January 2025 04:41:57 -0500 (0:00:00.036) 0:01:49.815 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Friday 17 January 2025 04:41:57 -0500 (0:00:00.035) 0:01:49.851 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Friday 17 January 2025 04:41:57 -0500 (0:00:00.035) 0:01:49.887 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Friday 17 January 2025 04:41:57 -0500 (0:00:00.037) 0:01:49.925 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Friday 17 January 2025 04:41:57 -0500 (0:00:00.039) 0:01:49.965 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Friday 17 January 2025 04:41:57 -0500 (0:00:00.035) 0:01:50.000 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Friday 17 January 2025 04:41:58 -0500 (0:00:00.038) 0:01:50.039 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Friday 17 January 2025 04:41:58 -0500 (0:00:00.037) 0:01:50.076 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Friday 17 January 2025 04:41:58 -0500 (0:00:00.037) 0:01:50.114 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Friday 17 January 2025 04:41:58 
TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Friday 17 January 2025 04:41:58 -0500 (0:00:00.037) 0:01:50.114 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Friday 17 January 2025 04:41:58 -0500 (0:00:00.042) 0:01:50.157 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Friday 17 January 2025 04:41:58 -0500 (0:00:00.052) 0:01:50.209 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Show expected size] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Friday 17 January 2025 04:41:58 -0500 (0:00:00.056) 0:01:50.266 ******** ok: [managed-node1] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" }
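
The "VARIABLE IS NOT DEFINED!" text above is not an error: when the debug module is pointed at an undefined variable it prints that marker and lets the play continue, and storage_test_expected_size is only set on the size-verification branches, all of which were skipped for this volume. A minimal reproduction:

- name: Show expected size
  debug:
    var: storage_test_expected_size   # undefined at this point, so the output is the
                                      # "VARIABLE IS NOT DEFINED!" marker, not a failure
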
TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Friday 17 January 2025 04:41:58 -0500 (0:00:00.056) 0:01:50.322 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Show test pool] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Friday 17 January 2025 04:41:58 -0500 (0:00:00.050) 0:01:50.372 ******** skipping: [managed-node1] => {}
TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Friday 17 January 2025 04:41:58 -0500 (0:00:00.047) 0:01:50.420 ******** skipping: [managed-node1] => {}
TASK [Show test pool size] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Friday 17 January 2025 04:41:58 -0500 (0:00:00.046) 0:01:50.467 ******** skipping: [managed-node1] => {}
TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Friday 17 January 2025 04:41:58 -0500 (0:00:00.046) 0:01:50.513 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Friday 17 January 2025 04:41:58 -0500 (0:00:00.038) 0:01:50.552 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Friday 17 January 2025 04:41:58 -0500 (0:00:00.036) 0:01:50.589 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Friday 17 January 2025 04:41:58 -0500 (0:00:00.049) 0:01:50.638 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Friday 17 January 2025 04:41:58 -0500 (0:00:00.062) 0:01:50.700 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Friday 17 January 2025 04:41:58 -0500 (0:00:00.055) 0:01:50.756 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Friday 17 January 2025 04:41:58 -0500 (0:00:00.046) 0:01:50.802 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Friday 17 January 2025 04:41:58 -0500 (0:00:00.049) 0:01:50.852 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Friday 17 January 2025 04:41:58 -0500 (0:00:00.046) 0:01:50.898 ******** skipping: [managed-node1] => {}
TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Friday 17 January 2025 04:41:58 -0500 (0:00:00.037) 0:01:50.936 ******** skipping: [managed-node1] => {}
TASK [Show test volume size] *************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Friday 17 January 2025 04:41:58 -0500 (0:00:00.036) 0:01:50.973 ******** skipping: [managed-node1] => {}
TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Friday 17 January 2025 04:41:58 -0500 (0:00:00.035) 0:01:51.009 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Friday 17 January 2025 04:41:59 -0500 (0:00:00.049) 0:01:51.058 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Friday 17
January 2025 04:41:59 -0500 (0:00:00.055) 0:01:51.113 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Friday 17 January 2025 04:41:59 -0500 (0:00:00.055) 0:01:51.169 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Friday 17 January 2025 04:41:59 -0500 (0:00:00.054) 0:01:51.223 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Friday 17 January 2025 04:41:59 -0500 (0:00:00.055) 0:01:51.279 ******** ok: [managed-node1] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Friday 17 January 2025 04:41:59 -0500 (0:00:00.064) 0:01:51.344 ******** ok: [managed-node1] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Friday 17 January 2025 04:41:59 -0500 (0:00:00.065) 0:01:51.410 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Friday 17 January 2025 04:41:59 -0500 (0:00:00.061) 0:01:51.471 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Friday 17 January 2025 04:41:59 -0500 (0:00:00.056) 0:01:51.527 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Friday 17 January 2025 04:41:59 -0500 (0:00:00.057) 0:01:51.585 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Friday 17 January 2025 04:41:59 -0500 (0:00:00.056) 0:01:51.641 ******** skipping: [managed-node1] => { "changed": 
false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Friday 17 January 2025 04:41:59 -0500 (0:00:00.067) 0:01:51.708 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Friday 17 January 2025 04:41:59 -0500 (0:00:00.051) 0:01:51.760 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Friday 17 January 2025 04:41:59 -0500 (0:00:00.045) 0:01:51.805 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Friday 17 January 2025 04:41:59 -0500 (0:00:00.044) 0:01:51.850 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Friday 17 January 2025 04:41:59 -0500 (0:00:00.043) 0:01:51.893 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Friday 17 January 2025 04:41:59 -0500 (0:00:00.041) 0:01:51.935 ******** changed: [managed-node1] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:157 Friday 17 January 2025 04:42:00 -0500 (0:00:00.370) 0:01:52.306 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node1 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Friday 17 January 2025 04:42:00 -0500 (0:00:00.090) 0:01:52.397 ******** ok: [managed-node1] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Friday 17 January 2025 04:42:00 -0500 (0:00:00.048) 
TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 January 2025 04:42:00 -0500 (0:00:00.055) 0:01:52.501 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node1
TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 January 2025 04:42:00 -0500 (0:00:00.059) 0:01:52.560 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 January 2025 04:42:00 -0500 (0:00:00.044) 0:01:52.605 ******** skipping: [managed-node1] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node1] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node1] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node1] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 January 2025 04:42:00 -0500 (0:00:00.092) 0:01:52.697 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 January 2025 04:42:00 -0500 (0:00:00.036) 0:01:52.734 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Friday 17 January 2025 04:42:00 -0500 (0:00:00.055) 0:01:52.789 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }
TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 January 2025 04:42:00 -0500 (0:00:00.047) 0:01:52.837 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }
TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 January 2025 04:42:00 -0500 (0:00:00.055) 0:01:52.892 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node1
TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Friday 17 January 2025 04:42:01 -0500 (0:00:00.134) 0:01:53.027 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] }
TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 January 2025 04:42:07 -0500 (0:00:06.059) 0:01:59.087 ******** ok: [managed-node1] => { "storage_pools": [] }
TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Friday 17 January 2025 04:42:07 -0500 (0:00:00.124) 0:01:59.212 ******** ok: [managed-node1] => { "storage_volumes": [ { "disks": [ "sda" ], "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] }
TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Friday 17 January 2025 04:42:07 -0500 (0:00:00.117) 0:01:59.329 ******** ok: [managed-node1] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] }
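
Note the ordering here: before changing anything, the role calls its blivet module once in a query-only pass to work out which packages the requested configuration needs, and cryptsetup shows up in the answer only because encryption: true was requested. The role then installs that list in the "Make sure required packages are installed" task a few tasks further on. A hedged sketch of that install step; the blivet_output register name is an assumption, not something this log shows:

- name: Make sure required packages are installed
  package:
    name: "{{ blivet_output.packages }}"   # evaluates to ["cryptsetup"] in this run
    state: present
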
TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Friday 17 January 2025 04:42:11 -0500 (0:00:04.076) 0:02:03.406 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node1
TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 January 2025 04:42:11 -0500 (0:00:00.161) 0:02:03.567 ********
TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path:
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 January 2025 04:42:11 -0500 (0:00:00.085) 0:02:03.653 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 January 2025 04:42:11 -0500 (0:00:00.063) 0:02:03.716 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 January 2025 04:42:11 -0500 (0:00:00.066) 0:02:03.783 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Friday 17 January 2025 04:42:12 -0500 (0:00:01.095) 0:02:04.878 ******** ok: [managed-node1] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": 
"systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": 
"network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", 
"status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-cryptsetup@luks\\x2d50ef5d15\\x2dccb7\\x2d43d2\\x2d8b00\\x2dfd1d2599bcf8.service": { "name": "systemd-cryptsetup@luks\\x2d50ef5d15\\x2dccb7\\x2d43d2\\x2d8b00\\x2dfd1d2599bcf8.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": 
"systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", 
"status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Friday 17 January 2025 04:42:13 -0500 (0:00:01.064) 0:02:05.942 ******** ok: [managed-node1] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d50ef5d15\\x2dccb7\\x2d43d2\\x2d8b00\\x2dfd1d2599bcf8.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Friday 17 January 2025 04:42:14 -0500 (0:00:00.081) 0:02:06.024 ******** changed: [managed-node1] => (item=systemd-cryptsetup@luks\x2d50ef5d15\x2dccb7\x2d43d2\x2d8b00\x2dfd1d2599bcf8.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d50ef5d15\\x2dccb7\\x2d43d2\\x2d8b00\\x2dfd1d2599bcf8.service", "name": "systemd-cryptsetup@luks\\x2d50ef5d15\\x2dccb7\\x2d43d2\\x2d8b00\\x2dfd1d2599bcf8.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "dev-sda.device system-systemd\\x2dcryptsetup.slice systemd-readahead-replay.service systemd-readahead-collect.service cryptsetup-pre.target systemd-journald.socket", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-sda.device", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "Delegate": "no", "Description": "Cryptography Setup for luks-50ef5d15-ccb7-43d2-8b00-fd1d2599bcf8", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-50ef5d15-ccb7-43d2-8b00-fd1d2599bcf8 /dev/sda - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-50ef5d15-ccb7-43d2-8b00-fd1d2599bcf8 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", 
"FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d50ef5d15\\x2dccb7\\x2d43d2\\x2d8b00\\x2dfd1d2599bcf8.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d50ef5d15\\x2dccb7\\x2d43d2\\x2d8b00\\x2dfd1d2599bcf8.service", "IgnoreOnIsolate": "yes", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d50ef5d15\\x2dccb7\\x2d43d2\\x2d8b00\\x2dfd1d2599bcf8.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "0", "TimeoutStopUSec": "0", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "bad", "WantedBy": "dev-sda.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Friday 17 January 2025 04:42:14 -0500 (0:00:00.625) 0:02:06.649 ******** fatal: [managed-node1]: FAILED! 
=> { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'sda' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:109 Friday 17 January 2025 04:42:18 -0500 (0:00:04.139) 0:02:10.789 ******** fatal: [managed-node1]: FAILED! => { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'diskvolume_mkfs_option_map': {u'ext4': u'-F', u'ext3': u'-F', u'ext2': u'-F'}, u'safe_mode': True, u'pools': [], u'volumes': [{u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 10737418240, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'encryption': True, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'fs_overwrite_existing': True, u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'cache_size': 0, u'mount_user': None, u'raid_spare_count': None, u'cache_mode': None, u'name': u'foo', u'mount_group': None, u'type': u'disk', u'disks': [u'sda'], u'cached': False, u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'pool_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_password': None, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'shared': False, u'type': u'lvm', u'encryption_key_size': None, u'raid_chunk_size': None}, u'volume_defaults': {u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 0, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'cache_size': 0, u'raid_spare_count': None, u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove existing formatting on device 'sda' in safe mode due to adding encryption"} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Friday 17 January 
2025 04:42:18 -0500 (0:00:00.067) 0:02:10.856 ******** changed: [managed-node1] => (item=systemd-cryptsetup@luks\x2d50ef5d15\x2dccb7\x2d43d2\x2d8b00\x2dfd1d2599bcf8.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d50ef5d15\\x2dccb7\\x2d43d2\\x2d8b00\\x2dfd1d2599bcf8.service", "name": "systemd-cryptsetup@luks\\x2d50ef5d15\\x2dccb7\\x2d43d2\\x2d8b00\\x2dfd1d2599bcf8.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d50ef5d15\\x2dccb7\\x2d43d2\\x2d8b00\\x2dfd1d2599bcf8.service", "DevicePolicy": "auto", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d50ef5d15\\x2dccb7\\x2d43d2\\x2d8b00\\x2dfd1d2599bcf8.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "1048576", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "masked", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d50ef5d15\\x2dccb7\\x2d43d2\\x2d8b00\\x2dfd1d2599bcf8.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", 
"StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", "UnitFileState": "bad", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Friday 17 January 2025 04:42:19 -0500 (0:00:00.593) 0:02:11.450 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Friday 17 January 2025 04:42:19 -0500 (0:00:00.062) 0:02:11.512 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Friday 17 January 2025 04:42:19 -0500 (0:00:00.077) 0:02:11.590 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Friday 17 January 2025 04:42:19 -0500 (0:00:00.054) 0:02:11.644 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737106920.1966581, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1737106920.1966581, "dev": 2048, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 67, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1737106920.1966581, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "1850724056", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Friday 17 January 2025 04:42:20 -0500 (0:00:00.417) 0:02:12.062 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Add encryption to the volume] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:177 Friday 17 January 2025 04:42:20 -0500 (0:00:00.061) 0:02:12.124 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 January 2025 
04:42:20 -0500 (0:00:00.136) 0:02:12.261 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 January 2025 04:42:20 -0500 (0:00:00.062) 0:02:12.324 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 January 2025 04:42:20 -0500 (0:00:00.060) 0:02:12.384 ******** skipping: [managed-node1] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node1] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node1] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node1] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 January 2025 04:42:20 -0500 (0:00:00.122) 0:02:12.507 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 January 2025 04:42:20 -0500 (0:00:00.059) 0:02:12.567 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Friday 17 January 2025 04:42:20 -0500 (0:00:00.056) 0:02:12.623 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 January 2025 04:42:20 -0500 (0:00:00.056) 0:02:12.679 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 January 2025 04:42:20 -0500 (0:00:00.056) 0:02:12.735 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Friday 17 January 2025 04:42:20 -0500 (0:00:00.127) 0:02:12.864 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 January 2025 04:42:24 -0500 (0:00:03.953) 0:02:16.817 ******** ok: [managed-node1] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Friday 17 January 2025 04:42:24 -0500 (0:00:00.049) 0:02:16.867 ******** ok: [managed-node1] => { "storage_volumes": [ { "disks": [ "sda" ], "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Friday 17 January 2025 04:42:24 -0500 (0:00:00.064) 0:02:16.931 ******** ok: [managed-node1] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Friday 17 January 2025 04:42:28 -0500 (0:00:03.973) 0:02:20.904 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 January 2025 04:42:28 -0500 (0:00:00.113) 0:02:21.018 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 January 2025 04:42:29 -0500 (0:00:00.058) 0:02:21.077 
******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 January 2025 04:42:29 -0500 (0:00:00.092) 0:02:21.169 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 January 2025 04:42:29 -0500 (0:00:00.097) 0:02:21.266 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Friday 17 January 2025 04:42:30 -0500 (0:00:00.924) 0:02:22.191 ******** ok: [managed-node1] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" 
}, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": 
"getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": 
"inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", 
"source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Friday 17 January 2025 04:42:31 -0500 (0:00:01.144) 0:02:23.336 ******** ok: [managed-node1] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Friday 17 January 2025 04:42:31 -0500 (0:00:00.071) 0:02:23.407 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Friday 17 January 2025 04:42:31 -0500 (0:00:00.046) 0:02:23.454 ******** changed: [managed-node1] => { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-c8a01f23-c175-4f83-9e55-22d5b037641b", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-c8a01f23-c175-4f83-9e55-22d5b037641b", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-c8a01f23-c175-4f83-9e55-22d5b037641b", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-c8a01f23-c175-4f83-9e55-22d5b037641b" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=b54980d6-c1a2-4bb9-8319-0097beaa5efa", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-c8a01f23-c175-4f83-9e55-22d5b037641b", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-c8a01f23-c175-4f83-9e55-22d5b037641b", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-c8a01f23-c175-4f83-9e55-22d5b037641b", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: 
TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Friday 17 January 2025 04:42:41 -0500 (0:00:10.275) 0:02:33.729 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Friday 17 January 2025 04:42:41 -0500 (0:00:00.071) 0:02:33.800 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737106909.7796392, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "24643ec488fd1b9f9cb1da3fa8b4b79a0ee5311d", "ctime": 1737106909.7766392, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263644, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1737106909.7766392, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1299, "uid": 0, "version": "18446744072031193646", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }
TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Friday 17 January 2025 04:42:42 -0500 (0:00:00.509) 0:02:34.310 ******** ok: [managed-node1] => { "backup": "", "changed": false }
TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Friday 17 January 2025 04:42:42 -0500 (0:00:00.480) 0:02:34.790 ********
TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Friday 17 January 2025 04:42:42 -0500 (0:00:00.051) 0:02:34.841 ******** ok: [managed-node1] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-c8a01f23-c175-4f83-9e55-22d5b037641b", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-c8a01f23-c175-4f83-9e55-22d5b037641b", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-c8a01f23-c175-4f83-9e55-22d5b037641b", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-c8a01f23-c175-4f83-9e55-22d5b037641b" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=b54980d6-c1a2-4bb9-8319-0097beaa5efa", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-c8a01f23-c175-4f83-9e55-22d5b037641b", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs" ], "pools": [], "volumes": [ { "_device":
"/dev/mapper/luks-c8a01f23-c175-4f83-9e55-22d5b037641b", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-c8a01f23-c175-4f83-9e55-22d5b037641b", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Friday 17 January 2025 04:42:42 -0500 (0:00:00.083) 0:02:34.925 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Friday 17 January 2025 04:42:42 -0500 (0:00:00.079) 0:02:35.004 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-c8a01f23-c175-4f83-9e55-22d5b037641b", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-c8a01f23-c175-4f83-9e55-22d5b037641b", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Friday 17 January 2025 04:42:43 -0500 (0:00:00.069) 0:02:35.073 ******** changed: [managed-node1] => (item={u'src': u'UUID=b54980d6-c1a2-4bb9-8319-0097beaa5efa', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, 
"dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=b54980d6-c1a2-4bb9-8319-0097beaa5efa", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=b54980d6-c1a2-4bb9-8319-0097beaa5efa" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Friday 17 January 2025 04:42:43 -0500 (0:00:00.651) 0:02:35.724 ******** ok: [managed-node1] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Friday 17 January 2025 04:42:44 -0500 (0:00:00.733) 0:02:36.457 ******** changed: [managed-node1] => (item={u'src': u'/dev/mapper/luks-c8a01f23-c175-4f83-9e55-22d5b037641b', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-c8a01f23-c175-4f83-9e55-22d5b037641b", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-c8a01f23-c175-4f83-9e55-22d5b037641b" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Friday 17 January 2025 04:42:44 -0500 (0:00:00.524) 0:02:36.981 ******** skipping: [managed-node1] => (item={u'src': u'/dev/mapper/luks-c8a01f23-c175-4f83-9e55-22d5b037641b', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-c8a01f23-c175-4f83-9e55-22d5b037641b", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Friday 17 January 2025 04:42:45 -0500 (0:00:00.096) 0:02:37.078 ******** ok: [managed-node1] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Friday 17 January 2025 04:42:45 -0500 (0:00:00.550) 0:02:37.628 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737106914.092647, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1737106911.5566423, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263659, "isblk": false, 
"ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1737106911.5556424, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "18446744072031199233", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Friday 17 January 2025 04:42:46 -0500 (0:00:00.509) 0:02:38.138 ******** changed: [managed-node1] => (item={u'state': u'present', u'password': u'-', u'name': u'luks-c8a01f23-c175-4f83-9e55-22d5b037641b', u'backing_device': u'/dev/sda'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-c8a01f23-c175-4f83-9e55-22d5b037641b", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Friday 17 January 2025 04:42:46 -0500 (0:00:00.467) 0:02:38.606 ******** ok: [managed-node1] TASK [Verify role results] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:190 Friday 17 January 2025 04:42:47 -0500 (0:00:00.818) 0:02:39.424 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node1 TASK [Print out pool information] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Friday 17 January 2025 04:42:47 -0500 (0:00:00.172) 0:02:39.597 ******** skipping: [managed-node1] => {} TASK [Print out volume information] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Friday 17 January 2025 04:42:47 -0500 (0:00:00.070) 0:02:39.667 ******** ok: [managed-node1] => { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-c8a01f23-c175-4f83-9e55-22d5b037641b", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-c8a01f23-c175-4f83-9e55-22d5b037641b", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, 
"thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Friday 17 January 2025 04:42:47 -0500 (0:00:00.082) 0:02:39.750 ******** ok: [managed-node1] => { "changed": false, "info": { "/dev/mapper/luks-c8a01f23-c175-4f83-9e55-22d5b037641b": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-c8a01f23-c175-4f83-9e55-22d5b037641b", "size": "10G", "type": "crypt", "uuid": "c66d4a80-43c3-455f-b8bf-298d21e38467" }, "/dev/sda": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "c8a01f23-c175-4f83-9e55-22d5b037641b" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Friday 17 January 2025 04:42:48 -0500 (0:00:00.544) 0:02:40.294 ******** ok: [managed-node1] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002892", "end": "2025-01-17 04:42:48.809688", "rc": 0, "start": "2025-01-17 04:42:48.806796" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Thu Jun 20 10:23:46 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 
TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Friday 17 January 2025 04:42:48 -0500 (0:00:00.622) 0:02:40.916 ******** ok: [managed-node1] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002814", "end": "2025-01-17 04:42:49.338634", "failed_when_result": false, "rc": 0, "start": "2025-01-17 04:42:49.335820" }
STDOUT:
luks-c8a01f23-c175-4f83-9e55-22d5b037641b /dev/sda -
TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Friday 17 January 2025 04:42:49 -0500 (0:00:00.525) 0:02:41.442 ********
TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Friday 17 January 2025 04:42:49 -0500 (0:00:00.068) 0:02:41.510 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node1
TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Friday 17 January 2025 04:42:49 -0500 (0:00:00.147) 0:02:41.657 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }
TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Friday 17 January 2025 04:42:49 -0500 (0:00:00.101) 0:02:41.759 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node1
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node1
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node1
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node1
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node1
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node1
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node1
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node1
TASK [Get expected mount device based on
TASK [Get expected mount device based on device type] **************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7
Friday 17 January 2025 04:42:50 -0500 (0:00:00.297) 0:02:42.056 ********
ok: [managed-node1] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-c8a01f23-c175-4f83-9e55-22d5b037641b" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11
Friday 17 January 2025 04:42:50 -0500 (0:00:00.065) 0:02:42.122 ********
ok: [managed-node1] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Get information about the mountpoint directory] **************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19
Friday 17 January 2025 04:42:50 -0500 (0:00:00.077) 0:02:42.199 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28
Friday 17 January 2025 04:42:50 -0500 (0:00:00.064) 0:02:42.263 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount directory user] *********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36
Friday 17 January 2025 04:42:50 -0500 (0:00:00.067) 0:02:42.331 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify mount directory group] ********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42
Friday 17 January 2025 04:42:50 -0500 (0:00:00.065) 0:02:42.397 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify mount directory permissions] **************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48
Friday 17 January 2025 04:42:50 -0500 (0:00:00.056) 0:02:42.454 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get path of test volume device] ******************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57
Friday 17 January 2025 04:42:50 -0500 (0:00:00.073) 0:02:42.527 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63
Friday 17 January 2025 04:42:50 -0500 (0:00:00.113) 0:02:42.641 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path:
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Friday 17 January 2025 04:42:50 -0500 (0:00:00.101) 0:02:42.742 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Friday 17 January 2025 04:42:50 -0500 (0:00:00.100) 0:02:42.842 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Friday 17 January 2025 04:42:50 -0500 (0:00:00.099) 0:02:42.942 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-c8a01f23-c175-4f83-9e55-22d5b037641b " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Friday 17 January 2025 04:42:51 -0500 (0:00:00.161) 0:02:43.104 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Friday 17 January 2025 04:42:51 -0500 (0:00:00.140) 0:02:43.244 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Friday 17 January 2025 04:42:51 -0500 (0:00:00.081) 0:02:43.326 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Friday 17 January 2025 04:42:51 -0500 (0:00:00.053) 0:02:43.380 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Friday 17 January 2025 04:42:51 -0500 (0:00:00.084) 0:02:43.464 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, 
"storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Friday 17 January 2025 04:42:51 -0500 (0:00:00.057) 0:02:43.522 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Friday 17 January 2025 04:42:51 -0500 (0:00:00.134) 0:02:43.657 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Friday 17 January 2025 04:42:51 -0500 (0:00:00.095) 0:02:43.752 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737106961.4757154, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1737106961.4757154, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 28267, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1737106961.4757154, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Friday 17 January 2025 04:42:52 -0500 (0:00:00.694) 0:02:44.447 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Friday 17 January 2025 04:42:52 -0500 (0:00:00.093) 0:02:44.541 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Friday 17 January 2025 04:42:52 -0500 (0:00:00.084) 0:02:44.625 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Friday 17 January 2025 04:42:52 -0500 (0:00:00.085) 0:02:44.711 ******** ok: [managed-node1] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Friday 17 January 2025 04:42:52 -0500 (0:00:00.063) 0:02:44.774 
******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38
Friday 17 January 2025 04:42:52 -0500 (0:00:00.063) 0:02:44.838 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3
Friday 17 January 2025 04:42:52 -0500 (0:00:00.067) 0:02:44.905 ********
ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737106961.5797157, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1737106961.5797157, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 367274, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1737106961.5797157, "nlink": 1, "path": "/dev/mapper/luks-c8a01f23-c175-4f83-9e55-22d5b037641b", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Ensure cryptsetup is present] ********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10
Friday 17 January 2025 04:42:53 -0500 (0:00:00.644) 0:02:45.550 ********
ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16
Friday 17 January 2025 04:42:54 -0500 (0:00:00.983) 0:02:46.533 ********
ok: [managed-node1] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda" ], "delta": "0:00:00.024990", "end": "2025-01-17 04:42:55.034622", "rc": 0, "start": "2025-01-17 04:42:55.009632" }

STDOUT:

LUKS header information for /dev/sda

Version:        1
Cipher name:    aes
Cipher mode:    xts-plain64
Hash spec:      sha256
Payload offset: 8192
MK bits:        512
MK digest:      41 a4 bc 8f a8 62 d4 28 7f 2f 4d e8 3a 94 20 c5 61 df 68 bb
MK salt:        22 ab 39 42 84 f9 73 3a c6 06 89 1e 25 be 49 fc 2d 44 8c 0b 4b 99 95 96 d5 4e 81 f9 48 39 59 8d
MK iterations:  22978
UUID:           c8a01f23-c175-4f83-9e55-22d5b037641b

Key Slot 0: ENABLED
        Iterations:             367662
        Salt:                   9a 6c 1c 87 71 0a 6a 6b 5e b7 d0 49 bb d3 9f 4a 20 3a ae f8 44 35 6a c4 81 7d 38 31 d3 f3 11 44
        Key material offset:    8
        AF stripes:             4000
Key Slot 1: DISABLED
Key Slot 2: DISABLED
Key Slot 3: DISABLED
Key Slot 4: DISABLED
Key Slot 5: DISABLED
Key Slot 6: DISABLED
Key Slot 7: DISABLED
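[Note: the header above is LUKS1 ("Version: 1") with a single enabled key slot. The role's own "Check LUKS version" task is skipped later in this run, but a minimal sketch of such a check against this output follows; the task names and the luks_info register variable are illustrative assumptions, not the role's code.]

    - name: Collect LUKS header info (same command the test runs)
      command: cryptsetup luksDump /dev/sda
      register: luks_info
      changed_when: false

    - name: Assert the header is LUKS version 1
      assert:
        that:
          - luks_info.stdout is search('Version:\s+1')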
TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22
Friday 17 January 2025 04:42:55 -0500 (0:00:00.627) 0:02:47.161 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29
Friday 17 January 2025 04:42:55 -0500 (0:00:00.077) 0:02:47.238 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40
Friday 17 January 2025 04:42:55 -0500 (0:00:00.074) 0:02:47.313 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46
Friday 17 January 2025 04:42:55 -0500 (0:00:00.107) 0:02:47.420 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Check LUKS version] ******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51
Friday 17 January 2025 04:42:55 -0500 (0:00:00.094) 0:02:47.515 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63
Friday 17 January 2025 04:42:55 -0500 (0:00:00.065) 0:02:47.581 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75
Friday 17 January 2025 04:42:55 -0500 (0:00:00.077) 0:02:47.659 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set test variables] ******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87
Friday 17 January 2025 04:42:55 -0500 (0:00:00.067) 0:02:47.726 ********
ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-c8a01f23-c175-4f83-9e55-22d5b037641b /dev/sda -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93
Friday 17 January 2025 04:42:55 -0500 (0:00:00.070) 0:02:47.797 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100
Friday 17 January 2025 04:42:55 -0500 (0:00:00.066) 0:02:47.863 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Check backing device of crypttab entry] ********************************** task
path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Friday 17 January 2025 04:42:55 -0500 (0:00:00.087) 0:02:47.951 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Friday 17 January 2025 04:42:56 -0500 (0:00:00.072) 0:02:48.023 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Friday 17 January 2025 04:42:56 -0500 (0:00:00.070) 0:02:48.094 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Friday 17 January 2025 04:42:56 -0500 (0:00:00.060) 0:02:48.155 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Friday 17 January 2025 04:42:56 -0500 (0:00:00.057) 0:02:48.212 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Friday 17 January 2025 04:42:56 -0500 (0:00:00.054) 0:02:48.267 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Friday 17 January 2025 04:42:56 -0500 (0:00:00.050) 0:02:48.318 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Friday 17 January 2025 04:42:56 -0500 (0:00:00.058) 0:02:48.376 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Friday 17 January 2025 04:42:56 -0500 (0:00:00.055) 0:02:48.432 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Friday 17 
January 2025 04:42:56 -0500 (0:00:00.062) 0:02:48.495 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Friday 17 January 2025 04:42:56 -0500 (0:00:00.126) 0:02:48.621 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Friday 17 January 2025 04:42:56 -0500 (0:00:00.062) 0:02:48.683 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Friday 17 January 2025 04:42:56 -0500 (0:00:00.059) 0:02:48.743 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Friday 17 January 2025 04:42:56 -0500 (0:00:00.055) 0:02:48.798 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Friday 17 January 2025 04:42:56 -0500 (0:00:00.059) 0:02:48.858 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Friday 17 January 2025 04:42:56 -0500 (0:00:00.058) 0:02:48.916 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Friday 17 January 2025 04:42:56 -0500 (0:00:00.072) 0:02:48.989 ******** ok: [managed-node1] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Friday 17 January 2025 04:42:57 -0500 (0:00:00.065) 0:02:49.054 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Friday 17 January 2025 04:42:57 -0500 (0:00:00.061) 0:02:49.116 ******** skipping: [managed-node1] => {} TASK [Show test blockinfo] 
***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Friday 17 January 2025 04:42:57 -0500 (0:00:00.063) 0:02:49.180 ******** skipping: [managed-node1] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Friday 17 January 2025 04:42:57 -0500 (0:00:00.048) 0:02:49.228 ******** skipping: [managed-node1] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Friday 17 January 2025 04:42:57 -0500 (0:00:00.057) 0:02:49.286 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Friday 17 January 2025 04:42:57 -0500 (0:00:00.055) 0:02:49.341 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Friday 17 January 2025 04:42:57 -0500 (0:00:00.037) 0:02:49.379 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Friday 17 January 2025 04:42:57 -0500 (0:00:00.037) 0:02:49.417 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Friday 17 January 2025 04:42:57 -0500 (0:00:00.041) 0:02:49.459 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Friday 17 January 2025 04:42:57 -0500 (0:00:00.046) 0:02:49.505 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Friday 17 January 2025 04:42:57 -0500 (0:00:00.056) 0:02:49.562 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Friday 17 January 2025 04:42:57 -0500 (0:00:00.060) 0:02:49.622 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Friday 17 January 2025 04:42:57 -0500 (0:00:00.056) 0:02:49.678 ******** skipping: [managed-node1] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Friday 17 January 2025 04:42:57 -0500 (0:00:00.054) 0:02:49.733 ******** skipping: [managed-node1] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Friday 17 January 2025 04:42:57 -0500 (0:00:00.057) 0:02:49.790 ******** skipping: [managed-node1] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Friday 17 January 2025 04:42:57 -0500 (0:00:00.056) 0:02:49.847 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Friday 17 January 2025 04:42:57 -0500 (0:00:00.060) 0:02:49.907 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Friday 17 January 2025 04:42:57 -0500 (0:00:00.058) 0:02:49.966 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Friday 17 January 2025 04:42:58 -0500 (0:00:00.056) 0:02:50.022 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Friday 17 January 2025 04:42:58 -0500 (0:00:00.072) 0:02:50.095 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Friday 17 January 2025 04:42:58 -0500 (0:00:00.059) 0:02:50.154 ******** ok: [managed-node1] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Friday 17 January 2025 04:42:58 -0500 (0:00:00.054) 0:02:50.208 ******** ok: [managed-node1] 
=> { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Friday 17 January 2025 04:42:58 -0500 (0:00:00.046) 0:02:50.255 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Friday 17 January 2025 04:42:58 -0500 (0:00:00.057) 0:02:50.313 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Friday 17 January 2025 04:42:58 -0500 (0:00:00.051) 0:02:50.364 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Friday 17 January 2025 04:42:58 -0500 (0:00:00.038) 0:02:50.403 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Friday 17 January 2025 04:42:58 -0500 (0:00:00.037) 0:02:50.440 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Friday 17 January 2025 04:42:58 -0500 (0:00:00.038) 0:02:50.479 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Friday 17 January 2025 04:42:58 -0500 (0:00:00.037) 0:02:50.516 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Friday 17 January 2025 04:42:58 -0500 (0:00:00.039) 0:02:50.556 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Friday 17 January 2025 04:42:58 -0500 (0:00:00.036) 0:02:50.593 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] 
********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Friday 17 January 2025 04:42:58 -0500 (0:00:00.037) 0:02:50.630 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Test for correct handling of new encrypted volume w/ no key] ************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:197 Friday 17 January 2025 04:42:58 -0500 (0:00:00.045) 0:02:50.676 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node1 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Friday 17 January 2025 04:42:58 -0500 (0:00:00.134) 0:02:50.810 ******** ok: [managed-node1] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Friday 17 January 2025 04:42:58 -0500 (0:00:00.067) 0:02:50.877 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 January 2025 04:42:58 -0500 (0:00:00.088) 0:02:50.966 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 January 2025 04:42:59 -0500 (0:00:00.100) 0:02:51.067 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 January 2025 04:42:59 -0500 (0:00:00.067) 0:02:51.134 ******** skipping: [managed-node1] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node1] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node1] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node1] => (item=CentOS_7.9.yml) => { "ansible_loop_var": 
"item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 January 2025 04:42:59 -0500 (0:00:00.153) 0:02:51.288 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 January 2025 04:42:59 -0500 (0:00:00.053) 0:02:51.341 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Friday 17 January 2025 04:42:59 -0500 (0:00:00.054) 0:02:51.396 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 January 2025 04:42:59 -0500 (0:00:00.056) 0:02:51.453 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 January 2025 04:42:59 -0500 (0:00:00.059) 0:02:51.513 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Friday 17 January 2025 04:42:59 -0500 (0:00:00.130) 0:02:51.643 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 January 2025 04:43:01 -0500 (0:00:02.272) 0:02:53.915 ******** ok: [managed-node1] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show 
TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14
Friday 17 January 2025 04:43:01 -0500 (0:00:00.091) 0:02:54.007 ********
ok: [managed-node1] => { "storage_volumes": [] }

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
Friday 17 January 2025 04:43:02 -0500 (0:00:00.104) 0:02:54.111 ********
ok: [managed-node1] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] }

TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31
Friday 17 January 2025 04:43:06 -0500 (0:00:04.128) 0:02:58.240 ********
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node1

TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
Friday 17 January 2025 04:43:06 -0500 (0:00:00.105) 0:02:58.345 ********

TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13
Friday 17 January 2025 04:43:06 -0500 (0:00:00.138) 0:02:58.484 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19
Friday 17 January 2025 04:43:06 -0500 (0:00:00.075) 0:02:58.559 ********

TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37
Friday 17 January 2025 04:43:06 -0500 (0:00:00.080) 0:02:58.639 ********
ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] }

TASK [fedora.linux_system_roles.storage : Get service facts] *******************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51
Friday 17 January 2025 04:43:07 -0500 (0:00:01.047) 0:02:59.687 ********
ok: [managed-node1] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" },
"auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": 
"dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": 
"lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, 
"plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": 
"rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { 
"name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Friday 17 January 2025 04:43:08 -0500 (0:00:01.265) 0:03:00.952 ******** ok: [managed-node1] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Friday 17 January 2025 04:43:09 -0500 (0:00:00.082) 0:03:01.035 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Friday 17 January 2025 04:43:09 -0500 (0:00:00.052) 0:03:01.088 ******** fatal: [managed-node1]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'test1' missing key/password TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:109 Friday 17 January 2025 04:43:13 -0500 (0:00:04.345) 0:03:05.434 ******** fatal: [managed-node1]: FAILED! 
=> { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'diskvolume_mkfs_option_map': {u'ext4': u'-F', u'ext3': u'-F', u'ext2': u'-F'}, u'safe_mode': False, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'encryption_tang_url': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_tang_thumbprint': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'encryption': True, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'mount_passno': 0, u'mount_user': None, u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'mount_group': None, u'fs_overwrite_existing': True, u'disks': [], u'cached': False, u'thin_pool_size': None, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'shared': False, u'encryption_clevis_pin': None, u'type': u'partition', u'encryption_cipher': None, u'raid_chunk_size': None}], u'volumes': [], u'pool_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_password': None, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'shared': False, u'type': u'lvm', u'encryption_key_size': None, u'raid_chunk_size': None}, u'volume_defaults': {u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 0, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'cache_size': 0, u'raid_spare_count': None, u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"encrypted volume 'test1' missing key/password"} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Friday 17 January 2025 04:43:13 -0500 (0:00:00.081) 
0:03:05.515 ******** TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Friday 17 January 2025 04:43:13 -0500 (0:00:00.048) 0:03:05.564 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Friday 17 January 2025 04:43:13 -0500 (0:00:00.068) 0:03:05.633 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Friday 17 January 2025 04:43:13 -0500 (0:00:00.069) 0:03:05.702 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Create an encrypted partition volume w/ default fs] ********************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:216 Friday 17 January 2025 04:43:13 -0500 (0:00:00.037) 0:03:05.740 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 January 2025 04:43:13 -0500 (0:00:00.137) 0:03:05.878 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 January 2025 04:43:13 -0500 (0:00:00.063) 0:03:05.941 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 January 2025 04:43:13 -0500 (0:00:00.062) 0:03:06.004 ******** skipping: [managed-node1] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node1] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node1] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node1] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if 
system is ostree] *********** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 January 2025 04:43:14 -0500 (0:00:00.131) 0:03:06.135 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 January 2025 04:43:14 -0500 (0:00:00.066) 0:03:06.202 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Friday 17 January 2025 04:43:14 -0500 (0:00:00.054) 0:03:06.257 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 January 2025 04:43:14 -0500 (0:00:00.059) 0:03:06.316 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 January 2025 04:43:14 -0500 (0:00:00.062) 0:03:06.379 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Friday 17 January 2025 04:43:14 -0500 (0:00:00.126) 0:03:06.506 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 January 2025 04:43:18 -0500 (0:00:03.999) 0:03:10.505 ******** ok: [managed-node1] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Friday 17 January 2025 04:43:18 -0500 (0:00:00.079) 0:03:10.585 ******** ok: [managed-node1] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Friday 17 January 2025 04:43:18 -0500 (0:00:00.070) 0:03:10.656 ******** ok: [managed-node1] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Friday 17 January 2025 04:43:22 -0500 (0:00:04.208) 0:03:14.864 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 January 2025 04:43:22 -0500 (0:00:00.076) 0:03:14.940 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 January 2025 04:43:22 -0500 (0:00:00.046) 0:03:14.987 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 January 2025 04:43:23 -0500 (0:00:00.058) 0:03:15.045 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 January 2025 04:43:23 -0500 (0:00:00.059) 0:03:15.105 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Friday 17 January 2025 04:43:23 -0500 (0:00:00.739) 0:03:15.844 ******** ok: [managed-node1] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, 
"auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": 
"dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": 
"lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, 
"plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": 
"rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { 
"name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Friday 17 January 2025 04:43:24 -0500 (0:00:00.998) 0:03:16.842 ******** ok: [managed-node1] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Friday 17 January 2025 04:43:24 -0500 (0:00:00.084) 0:03:16.927 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Friday 17 January 2025 04:43:24 -0500 (0:00:00.066) 0:03:16.993 ******** changed: [managed-node1] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-c8a01f23-c175-4f83-9e55-22d5b037641b", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-c8a01f23-c175-4f83-9e55-22d5b037641b", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6", "fs_type": "xfs" } ], 
"changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-c8a01f23-c175-4f83-9e55-22d5b037641b", "password": "-", "state": "absent" }, { "backing_device": "/dev/sda1", "name": "luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-c8a01f23-c175-4f83-9e55-22d5b037641b", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Friday 17 January 2025 04:43:35 -0500 (0:00:11.014) 0:03:28.008 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Friday 17 January 2025 04:43:36 -0500 (0:00:00.112) 0:03:28.120 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737106964.816719, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "9f060e22f62393b967f36795a28ce452a0e3fbdd", "ctime": 1737106964.813719, "dev": 51713, "device_type": 0, 
"executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263644, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1737106964.813719, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1311, "uid": 0, "version": "18446744072031193646", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Friday 17 January 2025 04:43:36 -0500 (0:00:00.561) 0:03:28.681 ******** ok: [managed-node1] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Friday 17 January 2025 04:43:37 -0500 (0:00:00.440) 0:03:29.122 ******** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Friday 17 January 2025 04:43:37 -0500 (0:00:00.056) 0:03:29.178 ******** ok: [managed-node1] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-c8a01f23-c175-4f83-9e55-22d5b037641b", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-c8a01f23-c175-4f83-9e55-22d5b037641b", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-c8a01f23-c175-4f83-9e55-22d5b037641b", "password": "-", "state": "absent" }, { "backing_device": "/dev/sda1", "name": "luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-c8a01f23-c175-4f83-9e55-22d5b037641b", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, 
"raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Friday 17 January 2025 04:43:37 -0500 (0:00:00.125) 0:03:29.304 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Friday 17 January 2025 04:43:37 -0500 (0:00:00.103) 0:03:29.407 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Friday 17 January 2025 04:43:37 -0500 (0:00:00.132) 0:03:29.540 ******** changed: [managed-node1] => (item={u'src': u'/dev/mapper/luks-c8a01f23-c175-4f83-9e55-22d5b037641b', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-c8a01f23-c175-4f83-9e55-22d5b037641b", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-c8a01f23-c175-4f83-9e55-22d5b037641b" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Friday 17 January 2025 04:43:38 -0500 (0:00:00.479) 0:03:30.020 ******** ok: [managed-node1] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Friday 17 January 2025 04:43:38 -0500 (0:00:00.587) 0:03:30.608 ******** changed: [managed-node1] => (item={u'src': u'/dev/mapper/luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Friday 17 January 2025 04:43:39 -0500 (0:00:00.443) 0:03:31.052 ******** skipping: [managed-node1] => (item={u'src': u'/dev/mapper/luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Friday 17 January 2025 04:43:39 -0500 
(0:00:00.074) 0:03:31.126 ******** ok: [managed-node1] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Friday 17 January 2025 04:43:39 -0500 (0:00:00.533) 0:03:31.660 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737106969.3367238, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "e45a2c8842d3c2f1d95354f25b89ea3a2e26e24c", "ctime": 1737106966.4997208, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263661, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1737106966.4997208, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 53, "uid": 0, "version": "18446744072031199392", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Friday 17 January 2025 04:43:40 -0500 (0:00:00.417) 0:03:32.077 ******** changed: [managed-node1] => (item={u'state': u'absent', u'password': u'-', u'name': u'luks-c8a01f23-c175-4f83-9e55-22d5b037641b', u'backing_device': u'/dev/sda'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-c8a01f23-c175-4f83-9e55-22d5b037641b", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed changed: [managed-node1] => (item={u'state': u'present', u'password': u'-', u'name': u'luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6', u'backing_device': u'/dev/sda1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Friday 17 January 2025 04:43:40 -0500 (0:00:00.783) 0:03:32.861 ******** ok: [managed-node1] TASK [Verify role results] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:233 Friday 17 January 2025 04:43:41 -0500 (0:00:00.759) 0:03:33.620 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node1 TASK [Print out pool information] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Friday 17 January 2025 04:43:41 -0500 (0:00:00.146) 0:03:33.767 ******** ok: [managed-node1] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, 
"encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Friday 17 January 2025 04:43:41 -0500 (0:00:00.087) 0:03:33.854 ******** skipping: [managed-node1] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Friday 17 January 2025 04:43:41 -0500 (0:00:00.056) 0:03:33.911 ******** ok: [managed-node1] => { "changed": false, "info": { "/dev/mapper/luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6", "size": "10G", "type": "crypt", "uuid": "93dbf6ae-92b2-4db2-a7f8-aad07c2c36ee" }, "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Friday 17 January 2025 04:43:43 -0500 (0:00:01.638) 0:03:35.550 ******** ok: [managed-node1] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002826", "end": "2025-01-17 04:43:43.862003", "rc": 0, "start": "2025-01-17 04:43:43.859177" } STDOUT:
# system_role:storage
#
# /etc/fstab
# Created by anaconda on Thu Jun 20 10:23:46 2024
#
# Accessible filesystems, by reference, are maintained under '/dev/disk'
# See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info
#
UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
/dev/mapper/luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6 /opt/test1 xfs defaults 0 0
TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Friday 17 January 2025 04:43:43 -0500 (0:00:00.511) 0:03:35.974 ******** ok: [managed-node1] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002825", "end": "2025-01-17 04:43:44.366893", "failed_when_result": false, "rc": 0, "start": "2025-01-17 04:43:44.364068" } STDOUT: luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6 /dev/sda1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Friday 17 January 2025 04:43:44 -0500 (0:00:00.128) 0:03:36.486 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node1 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Friday 17 January 2025 04:43:44 -0500 (0:00:00.128) 0:03:36.614 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Friday 17 January 2025 04:43:44 -0500 (0:00:00.060) 0:03:36.675 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Friday 17 January 2025 04:43:44 -0500 (0:00:00.048) 0:03:36.724 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Friday 17 January 2025 04:43:44 -0500 (0:00:00.054) 0:03:36.778 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node1 TASK [Set test variables] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Friday 17 January 2025 04:43:44 -0500 (0:00:00.116) 0:03:36.894 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path:
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Friday 17 January 2025 04:43:44 -0500 (0:00:00.047) 0:03:36.942 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Friday 17 January 2025 04:43:44 -0500 (0:00:00.047) 0:03:36.990 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Friday 17 January 2025 04:43:45 -0500 (0:00:00.037) 0:03:37.028 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Friday 17 January 2025 04:43:45 -0500 (0:00:00.044) 0:03:37.072 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Friday 17 January 2025 04:43:45 -0500 (0:00:00.050) 0:03:37.122 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Friday 17 January 2025 04:43:45 -0500 (0:00:00.060) 0:03:37.183 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Friday 17 January 2025 04:43:45 -0500 (0:00:00.063) 0:03:37.246 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:51 Friday 17 January 2025 04:43:45 -0500 (0:00:00.058) 0:03:37.304 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:64 Friday 17 January 2025 04:43:45 -0500 (0:00:00.085) 0:03:37.390 ******** ok: [managed-node1] => { "changed": false, "rc": 0 } STDOUT: False STDERR: Shared connection to 10.31.46.65 closed. 
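(For orientation in the verification tasks that follow: the pool and volume state being checked corresponds to a storage role invocation along these lines. This is a minimal sketch assuming the role's documented storage_pools interface; the encryption password below is a placeholder for the value logged as VALUE_SPECIFIED_IN_NO_LOG_PARAMETER.)

    - hosts: managed-node1
      roles:
        - fedora.linux_system_roles.storage
      vars:
        storage_pools:
          - name: foo
            type: partition
            disks:
              - sda
            volumes:
              - name: test1
                type: partition
                size: 4g
                fs_type: xfs
                mount_point: /opt/test1
                encryption: true
                encryption_password: "<placeholder>"  # stand-in for the no_log value

(This is consistent with the blivet_output actions above: the old whole-disk LUKS format on /dev/sda is destroyed, a disklabel and /dev/sda1 are created, and a new LUKS container carrying an xfs filesystem is created on the partition.)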
TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:73 Friday 17 January 2025 04:43:45 -0500 (0:00:00.368) 0:03:37.759 ******** TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:83 Friday 17 January 2025 04:43:45 -0500 (0:00:00.056) 0:03:37.816 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node1 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Friday 17 January 2025 04:43:45 -0500 (0:00:00.128) 0:03:37.945 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Friday 17 January 2025 04:43:45 -0500 (0:00:00.064) 0:03:38.009 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Friday 17 January 2025 04:43:46 -0500 (0:00:00.060) 0:03:38.070 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Friday 17 January 2025 04:43:46 -0500 (0:00:00.061) 0:03:38.132 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Friday 17 January 2025 04:43:46 -0500 (0:00:00.059) 0:03:38.191 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Friday 17 January 2025 04:43:46 -0500 (0:00:00.069) 0:03:38.261 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Friday 17 January 2025 04:43:46 -0500 (0:00:00.148) 0:03:38.410 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Friday 17 January 2025 04:43:46 -0500 (0:00:00.080) 0:03:38.490 ******** skipping: [managed-node1] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Friday 17 January 2025 04:43:46 -0500 (0:00:00.057) 0:03:38.547 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Friday 17 January 2025 04:43:46 -0500 (0:00:00.072) 0:03:38.620 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Friday 17 January 2025 04:43:46 -0500 (0:00:00.061) 0:03:38.681 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:86 Friday 17 January 2025 04:43:46 -0500 (0:00:00.078) 0:03:38.760 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node1 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Friday 17 January 2025 04:43:46 -0500 (0:00:00.154) 0:03:38.914 ******** skipping: [managed-node1] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'/dev/mapper/luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'_kernel_device': u'/dev/dm-0', u'encryption': True, u'raid_level': None, u'name': u'test1', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'mount_passno': 0, u'_mount_id': u'/dev/mapper/luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6', u'mount_user': None, u'raid_spare_count': None, u'raid_disks': [], u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'cache_devices': [], u'deduplication': None, u'mount_group': None, u'thin_pool_size': None, u'disks': [], u'cached': False, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'fs_overwrite_existing': True, u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/mapper/luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6", "_kernel_device": "/dev/dm-0", "_mount_id": 
"/dev/mapper/luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:89 Friday 17 January 2025 04:43:46 -0500 (0:00:00.081) 0:03:38.996 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node1 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Friday 17 January 2025 04:43:47 -0500 (0:00:00.174) 0:03:39.170 ******** skipping: [managed-node1] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'/dev/mapper/luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'_kernel_device': u'/dev/dm-0', u'encryption': True, u'raid_level': None, u'name': u'test1', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'mount_passno': 0, u'_mount_id': u'/dev/mapper/luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6', u'mount_user': None, u'raid_spare_count': None, u'raid_disks': [], u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'cache_devices': [], u'deduplication': None, u'mount_group': None, u'thin_pool_size': None, u'disks': [], u'cached': False, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'fs_overwrite_existing': True, u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/mapper/luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": 
true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:92 Friday 17 January 2025 04:43:47 -0500 (0:00:00.141) 0:03:39.312 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node1 TASK [Set test variables] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Friday 17 January 2025 04:43:47 -0500 (0:00:00.166) 0:03:39.478 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Friday 17 January 2025 04:43:47 -0500 (0:00:00.073) 0:03:39.552 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Friday 17 January 2025 04:43:47 -0500 (0:00:00.069) 0:03:39.621 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Friday 17 January 2025 04:43:47 -0500 (0:00:00.063) 0:03:39.684 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:95 Friday 17 January 2025 04:43:47 -0500 (0:00:00.067) 0:03:39.752 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node1 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Friday 17 January 2025 04:43:47 -0500 (0:00:00.156) 0:03:39.909 ******** 
skipping: [managed-node1] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'/dev/mapper/luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'_kernel_device': u'/dev/dm-0', u'encryption': True, u'raid_level': None, u'name': u'test1', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'mount_passno': 0, u'_mount_id': u'/dev/mapper/luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6', u'mount_user': None, u'raid_spare_count': None, u'raid_disks': [], u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'cache_devices': [], u'deduplication': None, u'mount_group': None, u'thin_pool_size': None, u'disks': [], u'cached': False, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'fs_overwrite_existing': True, u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/mapper/luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:98 Friday 17 January 2025 04:43:47 -0500 (0:00:00.089) 0:03:39.998 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node1 TASK [Run 'stratis report'] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Friday 17 January 2025 04:43:48 -0500 (0:00:00.241) 0:03:40.240 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about Stratis] ******************************************* task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:11 Friday 17 January 2025 04:43:48 -0500 (0:00:00.059) 0:03:40.300 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pool was created] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Friday 17 January 2025 04:43:48 -0500 (0:00:00.062) 0:03:40.362 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:25 Friday 17 January 2025 04:43:48 -0500 (0:00:00.117) 0:03:40.480 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:34 Friday 17 January 2025 04:43:48 -0500 (0:00:00.125) 0:03:40.606 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:44 Friday 17 January 2025 04:43:48 -0500 (0:00:00.087) 0:03:40.694 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:101 Friday 17 January 2025 04:43:48 -0500 (0:00:00.068) 0:03:40.762 ******** ok: [managed-node1] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Friday 17 January 2025 04:43:48 -0500 (0:00:00.072) 0:03:40.834 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node1 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Friday 17 January 2025 04:43:48 -0500 (0:00:00.150) 0:03:40.985 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Friday 17 January 2025 04:43:49 -0500 (0:00:00.070) 0:03:41.055 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for
managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node1 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Friday 17 January 2025 04:43:49 -0500 (0:00:00.282) 0:03:41.338 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Friday 17 January 2025 04:43:49 -0500 (0:00:00.084) 0:03:41.423 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Friday 17 January 2025 04:43:49 -0500 (0:00:00.100) 0:03:41.524 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Friday 17 January 2025 04:43:49 -0500 (0:00:00.092) 0:03:41.616 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Friday 17 January 2025 04:43:49 -0500 (0:00:00.087) 0:03:41.704 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Friday 17 January 2025 04:43:49 -0500 (0:00:00.095) 0:03:41.799 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Friday 17 January 2025 
04:43:49 -0500 (0:00:00.122) 0:03:41.922 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Friday 17 January 2025 04:43:50 -0500 (0:00:00.107) 0:03:42.030 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Friday 17 January 2025 04:43:50 -0500 (0:00:00.068) 0:03:42.098 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Friday 17 January 2025 04:43:50 -0500 (0:00:00.075) 0:03:42.173 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Friday 17 January 2025 04:43:50 -0500 (0:00:00.088) 0:03:42.262 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Friday 17 January 2025 04:43:50 -0500 (0:00:00.061) 0:03:42.324 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Friday 17 January 2025 04:43:50 -0500 (0:00:00.095) 0:03:42.420 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Friday 17 January 2025 04:43:50 -0500 (0:00:00.068) 0:03:42.488 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Friday 17 January 2025 04:43:50 -0500 (0:00:00.056) 0:03:42.545 ******** skipping: [managed-node1] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Friday 17 January 2025 04:43:50 -0500 (0:00:00.058) 0:03:42.603 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Friday 17 January 2025 04:43:50 -0500 (0:00:00.087) 0:03:42.691 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Friday 17 January 2025 04:43:50 -0500 (0:00:00.050) 0:03:42.741 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Friday 17 January 2025 04:43:50 -0500 (0:00:00.077) 0:03:42.819 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Friday 17 January 2025 04:43:50 -0500 (0:00:00.087) 0:03:42.907 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737107015.6937726, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1737107015.6937726, "dev": 5, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 376386, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1737107015.6937726, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Friday 17 January 2025 04:43:51 -0500 (0:00:00.400) 0:03:43.307 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Friday 17 January 2025 04:43:51 -0500 (0:00:00.072) 0:03:43.380 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } 
TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Friday 17 January 2025 04:43:51 -0500 (0:00:00.057) 0:03:43.437 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Friday 17 January 2025 04:43:51 -0500 (0:00:00.073) 0:03:43.511 ******** ok: [managed-node1] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Friday 17 January 2025 04:43:51 -0500 (0:00:00.064) 0:03:43.575 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Friday 17 January 2025 04:43:51 -0500 (0:00:00.057) 0:03:43.633 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Friday 17 January 2025 04:43:51 -0500 (0:00:00.067) 0:03:43.701 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737107015.8047729, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1737107015.8047729, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 376422, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1737107015.8047729, "nlink": 1, "path": "/dev/mapper/luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Friday 17 January 2025 04:43:52 -0500 (0:00:00.433) 0:03:44.134 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Friday 17 January 2025 04:43:52 -0500 (0:00:00.715) 0:03:44.850 ******** ok: [managed-node1] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda1" ], "delta": "0:00:00.025604", "end": "2025-01-17 04:43:53.116771", "rc": 0, "start": "2025-01-17 04:43:53.091167" } STDOUT:
LUKS header information for /dev/sda1

Version:        1
Cipher name:    aes
Cipher mode:    xts-plain64
Hash spec:      sha256
Payload offset: 8192
MK bits:        512
MK digest:      09 9b 84 0e 70 a4 39 61 b3 c5 e5 c6 f5 0b 43 26 b5 f1 1d 21
MK salt:        79 76 f2 05 b8 b4 bc 12 39 4d 89 b5 5b a8 2e 95 68 eb cc 24 3a 78 10 12 ab dd c3 77 2b f8 c1 09
MK iterations:  23011
UUID:           fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6

Key Slot 0: ENABLED
        Iterations:          368178
        Salt:                36 44 33 15 78 a4 8e 99 11 b1 18 85 1a 11 7a aa 15 8e 25 2c df 3c c3 b0 54 d1 c5 5c 68 1b 77 34
        Key material offset: 8
        AF stripes:          4000
Key Slot 1: DISABLED
Key Slot 2: DISABLED
Key Slot 3: DISABLED
Key Slot 4: DISABLED
Key Slot 5: DISABLED
Key Slot 6: DISABLED
Key Slot 7: DISABLED
TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Friday 17 January 2025 04:43:53 -0500 (0:00:00.353) 0:03:45.204 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Friday 17 January 2025 04:43:53 -0500 (0:00:00.048) 0:03:45.253 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Friday 17 January 2025 04:43:53 -0500 (0:00:00.051) 0:03:45.304 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Friday 17 January 2025 04:43:53 -0500 (0:00:00.108) 0:03:45.412 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Friday 17 January 2025 04:43:53 -0500 (0:00:00.047) 0:03:45.460 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Friday 17 January 2025 04:43:53 -0500 (0:00:00.041) 0:03:45.502 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Friday 17 January 2025 04:43:53 -0500 (0:00:00.040) 0:03:45.542 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Friday 17 January 2025 04:43:53 -0500 (0:00:00.040) 0:03:45.583 ******** ok: [managed-node1] => { "ansible_facts": {
"_storage_test_crypttab_entries": [ "luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6 /dev/sda1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Friday 17 January 2025 04:43:53 -0500 (0:00:00.048) 0:03:45.631 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Friday 17 January 2025 04:43:53 -0500 (0:00:00.047) 0:03:45.679 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Friday 17 January 2025 04:43:53 -0500 (0:00:00.049) 0:03:45.728 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Friday 17 January 2025 04:43:53 -0500 (0:00:00.049) 0:03:45.778 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Friday 17 January 2025 04:43:53 -0500 (0:00:00.050) 0:03:45.828 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Friday 17 January 2025 04:43:53 -0500 (0:00:00.039) 0:03:45.867 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Friday 17 January 2025 04:43:53 -0500 (0:00:00.039) 0:03:45.907 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Friday 17 January 2025 04:43:53 -0500 (0:00:00.041) 0:03:45.948 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Friday 17 January 2025 04:43:53 -0500 (0:00:00.038) 0:03:45.987 ******** skipping: [managed-node1] => { "changed": 
false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Friday 17 January 2025 04:43:54 -0500 (0:00:00.039) 0:03:46.026 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Friday 17 January 2025 04:43:54 -0500 (0:00:00.039) 0:03:46.066 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Friday 17 January 2025 04:43:54 -0500 (0:00:00.039) 0:03:46.105 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Friday 17 January 2025 04:43:54 -0500 (0:00:00.039) 0:03:46.144 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Friday 17 January 2025 04:43:54 -0500 (0:00:00.051) 0:03:46.196 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Friday 17 January 2025 04:43:54 -0500 (0:00:00.070) 0:03:46.267 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Friday 17 January 2025 04:43:54 -0500 (0:00:00.039) 0:03:46.307 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Friday 17 January 2025 04:43:54 -0500 (0:00:00.043) 0:03:46.351 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Friday 17 January 2025 04:43:54 -0500 (0:00:00.042) 0:03:46.393 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Friday 17 January 2025 04:43:54 -0500 (0:00:00.041) 0:03:46.434 ******** ok: [managed-node1] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Friday 17 January 2025 04:43:54 -0500 (0:00:00.044) 0:03:46.479 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Friday 17 January 2025 04:43:54 -0500 (0:00:00.042) 0:03:46.521 ******** skipping: [managed-node1] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Friday 17 January 2025 04:43:54 -0500 (0:00:00.042) 0:03:46.563 ******** skipping: [managed-node1] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Friday 17 January 2025 04:43:54 -0500 (0:00:00.040) 0:03:46.604 ******** skipping: [managed-node1] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Friday 17 January 2025 04:43:54 -0500 (0:00:00.040) 0:03:46.645 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Friday 17 January 2025 04:43:54 -0500 (0:00:00.042) 0:03:46.687 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Friday 17 January 2025 04:43:54 -0500 (0:00:00.042) 0:03:46.729 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Friday 17 January 2025 04:43:54 -0500 (0:00:00.039) 0:03:46.768 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Friday 17 January 2025 04:43:54 -0500 (0:00:00.038) 0:03:46.807 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] 
******************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Friday 17 January 2025 04:43:54 -0500 (0:00:00.038) 0:03:46.845 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Friday 17 January 2025 04:43:54 -0500 (0:00:00.038) 0:03:46.883 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Friday 17 January 2025 04:43:54 -0500 (0:00:00.038) 0:03:46.921 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Friday 17 January 2025 04:43:54 -0500 (0:00:00.040) 0:03:46.962 ******** skipping: [managed-node1] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Friday 17 January 2025 04:43:54 -0500 (0:00:00.039) 0:03:47.002 ******** skipping: [managed-node1] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Friday 17 January 2025 04:43:55 -0500 (0:00:00.038) 0:03:47.041 ******** skipping: [managed-node1] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Friday 17 January 2025 04:43:55 -0500 (0:00:00.039) 0:03:47.080 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Friday 17 January 2025 04:43:55 -0500 (0:00:00.038) 0:03:47.119 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Friday 17 January 2025 04:43:55 -0500 (0:00:00.038) 0:03:47.157 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Friday 17 January 2025 04:43:55 -0500 (0:00:00.040) 0:03:47.198 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] 
********************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Friday 17 January 2025 04:43:55 -0500 (0:00:00.038) 0:03:47.237 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Friday 17 January 2025 04:43:55 -0500 (0:00:00.050) 0:03:47.288 ******** ok: [managed-node1] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Friday 17 January 2025 04:43:55 -0500 (0:00:00.044) 0:03:47.332 ******** ok: [managed-node1] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Friday 17 January 2025 04:43:55 -0500 (0:00:00.041) 0:03:47.374 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Friday 17 January 2025 04:43:55 -0500 (0:00:00.041) 0:03:47.415 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Friday 17 January 2025 04:43:55 -0500 (0:00:00.047) 0:03:47.462 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Friday 17 January 2025 04:43:55 -0500 (0:00:00.055) 0:03:47.518 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Friday 17 January 2025 04:43:55 -0500 (0:00:00.061) 0:03:47.580 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Friday 17 January 2025 04:43:55 -0500 (0:00:00.053) 0:03:47.633 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: 
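
Everything from the RAID checks through the thin-pool and cache-size calculations is reported as "skipping" with "Conditional result was False": the volume under test is a plain partition (st_volume_type was set to "partition" earlier), so none of the RAID, LVM size, thin-pool, or cache branches apply. The mechanism behind those skips is an ordinary when: guard; a minimal illustrative sketch, not the test's exact task:

    - name: Get information about RAID
      command: mdadm --detail /dev/md/test      # hypothetical device path
      register: raid_info
      changed_when: false
      when: st_volume_type == 'raid'            # false for a plain partition, so Ansible skips the task
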
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Friday 17 January 2025 04:43:55 -0500 (0:00:00.050) 0:03:47.683 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Friday 17 January 2025 04:43:55 -0500 (0:00:00.048) 0:03:47.732 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Friday 17 January 2025 04:43:55 -0500 (0:00:00.054) 0:03:47.786 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Friday 17 January 2025 04:43:55 -0500 (0:00:00.038) 0:03:47.825 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Friday 17 January 2025 04:43:55 -0500 (0:00:00.034) 0:03:47.859 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Friday 17 January 2025 04:43:55 -0500 (0:00:00.039) 0:03:47.899 ******** changed: [managed-node1] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:239 Friday 17 January 2025 04:43:56 -0500 (0:00:00.365) 0:03:48.265 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node1 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Friday 17 January 2025 04:43:56 -0500 (0:00:00.147) 0:03:48.412 ******** ok: [managed-node1] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Friday 17 January 2025 04:43:56 -0500 (0:00:00.065) 0:03:48.478 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 January 2025 04:43:56 
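
The "Create a file" task above drops a marker file onto the mounted volume before the role is re-run with safe mode enabled; data preservation is then judged by whether that file survives, and the re-run itself is expected to fail rather than reformat the device. The file creation corresponds to a plain file task; a minimal sketch matching the logged result:

    - name: Create a file
      file:
        path: /opt/test1/quux
        state: touch                            # creates an empty root-owned file, mode 0644, as logged
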
-0500 (0:00:00.089) 0:03:48.567 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 January 2025 04:43:56 -0500 (0:00:00.089) 0:03:48.657 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 January 2025 04:43:56 -0500 (0:00:00.135) 0:03:48.793 ******** skipping: [managed-node1] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node1] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node1] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node1] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 January 2025 04:43:56 -0500 (0:00:00.106) 0:03:48.899 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 January 2025 04:43:56 -0500 (0:00:00.037) 0:03:48.937 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Friday 17 January 2025 04:43:56 -0500 (0:00:00.041) 0:03:48.978 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 January 2025 04:43:57 -0500 (0:00:00.051) 0:03:49.030 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: 
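
The variable loading above walks RedHat.yml, CentOS.yml, CentOS_7.yml, and CentOS_7.9.yml in turn, and only CentOS_7.yml matches this CentOS 7 node. Note the inline Jinja conditional in the resulting blivet_package_list, which picks the architecture-specific libblockdev package; the construct, as logged (list shortened here for brevity):

    blivet_package_list:
      - python-enum34
      - python-blivet3
      - "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}"
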
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 January 2025 04:43:57 -0500 (0:00:00.056) 0:03:49.086 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Friday 17 January 2025 04:43:57 -0500 (0:00:00.120) 0:03:49.207 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 January 2025 04:43:59 -0500 (0:00:01.933) 0:03:51.141 ******** ok: [managed-node1] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Friday 17 January 2025 04:43:59 -0500 (0:00:00.049) 0:03:51.190 ******** ok: [managed-node1] => { "storage_volumes": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Friday 17 January 2025 04:43:59 -0500 (0:00:00.048) 0:03:51.239 ******** ok: [managed-node1] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Friday 17 January 2025 04:44:03 -0500 (0:00:04.038) 0:03:55.278 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 January 2025 04:44:03 -0500 (0:00:00.071) 0:03:55.349 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 January 2025 04:44:03 -0500 (0:00:00.033) 
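
The storage_pools value shown above requests an unencrypted partition volume on a disk that currently carries a LUKS container, which is exactly the change that safe mode must refuse further below. As a playbook variable, the logged spec is:

    storage_pools:
      - name: foo
        type: partition
        disks: [sda]
        volumes:
          - name: test1
            type: partition
            size: 4g
            mount_point: /opt/test1
            encryption: false
            encryption_password: yabbadabbadoo  # throwaway test password, copied from the log
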
0:03:55.383 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 January 2025 04:44:03 -0500 (0:00:00.036) 0:03:55.420 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 January 2025 04:44:03 -0500 (0:00:00.037) 0:03:55.457 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Friday 17 January 2025 04:44:04 -0500 (0:00:00.680) 0:03:56.137 ******** ok: [managed-node1] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": 
"container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", 
"state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": 
"rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": 
"systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-cryptsetup@luks\\x2dc8a01f23\\x2dc175\\x2d4f83\\x2d9e55\\x2d22d5b037641b.service": { "name": "systemd-cryptsetup@luks\\x2dc8a01f23\\x2dc175\\x2d4f83\\x2d9e55\\x2d22d5b037641b.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": 
"teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Friday 17 January 2025 04:44:05 -0500 (0:00:01.058) 0:03:57.196 ******** ok: [managed-node1] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2dc8a01f23\\x2dc175\\x2d4f83\\x2d9e55\\x2d22d5b037641b.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Friday 17 January 2025 04:44:05 -0500 (0:00:00.058) 0:03:57.254 ******** changed: [managed-node1] => (item=systemd-cryptsetup@luks\x2dc8a01f23\x2dc175\x2d4f83\x2d9e55\x2d22d5b037641b.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dc8a01f23\\x2dc175\\x2d4f83\\x2d9e55\\x2d22d5b037641b.service", "name": "systemd-cryptsetup@luks\\x2dc8a01f23\\x2dc175\\x2d4f83\\x2d9e55\\x2d22d5b037641b.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-journald.socket systemd-readahead-collect.service system-systemd\\x2dcryptsetup.slice cryptsetup-pre.target dev-sda.device systemd-readahead-replay.service", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-sda.device", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "Delegate": "no", "Description": "Cryptography Setup for luks-c8a01f23-c175-4f83-9e55-22d5b037641b", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-c8a01f23-c175-4f83-9e55-22d5b037641b /dev/sda - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-c8a01f23-c175-4f83-9e55-22d5b037641b ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2dc8a01f23\\x2dc175\\x2d4f83\\x2d9e55\\x2d22d5b037641b.service", "GuessMainPID": "yes", 
"IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2dc8a01f23\\x2dc175\\x2d4f83\\x2d9e55\\x2d22d5b037641b.service", "IgnoreOnIsolate": "yes", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2dc8a01f23\\x2dc175\\x2d4f83\\x2d9e55\\x2d22d5b037641b.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "0", "TimeoutStopUSec": "0", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "bad", "WantedBy": "dev-sda.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Friday 17 January 2025 04:44:05 -0500 (0:00:00.556) 0:03:57.811 ******** fatal: [managed-node1]: FAILED! 
=> { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:109 Friday 17 January 2025 04:44:10 -0500 (0:00:04.219) 0:04:02.031 ******** fatal: [managed-node1]: FAILED! => { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'diskvolume_mkfs_option_map': {u'ext4': u'-F', u'ext3': u'-F', u'ext2': u'-F'}, u'safe_mode': True, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'encryption_tang_url': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_tang_thumbprint': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'encryption': False, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_key_size': 0, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': u'luks1', u'raid_stripe_size': None, u'mount_passno': 0, u'mount_user': None, u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'mount_group': None, u'fs_overwrite_existing': True, u'disks': [u'sda'], u'cached': False, u'thin_pool_size': None, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'shared': False, u'encryption_clevis_pin': None, u'type': u'partition', u'encryption_cipher': None, u'raid_chunk_size': None}], u'volumes': [], u'pool_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_password': None, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'shared': False, u'type': u'lvm', u'encryption_key_size': None, u'raid_chunk_size': None}, u'volume_defaults': {u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 0, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'cache_size': 0, u'raid_spare_count': None, u'cache_mode': None, u'deduplication': None, 
u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove existing formatting on device 'luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6' in safe mode due to encryption removal"} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Friday 17 January 2025 04:44:10 -0500 (0:00:00.054) 0:04:02.085 ******** changed: [managed-node1] => (item=systemd-cryptsetup@luks\x2dc8a01f23\x2dc175\x2d4f83\x2d9e55\x2d22d5b037641b.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dc8a01f23\\x2dc175\\x2d4f83\\x2d9e55\\x2d22d5b037641b.service", "name": "systemd-cryptsetup@luks\\x2dc8a01f23\\x2dc175\\x2d4f83\\x2d9e55\\x2d22d5b037641b.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2dc8a01f23\\x2dc175\\x2d4f83\\x2d9e55\\x2d22d5b037641b.service", "DevicePolicy": "auto", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2dc8a01f23\\x2dc175\\x2d4f83\\x2d9e55\\x2d22d5b037641b.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "1048576", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "masked", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2dc8a01f23\\x2dc175\\x2d4f83\\x2d9e55\\x2d22d5b037641b.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", 
"PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", "UnitFileState": "bad", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Friday 17 January 2025 04:44:10 -0500 (0:00:00.618) 0:04:02.703 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Friday 17 January 2025 04:44:10 -0500 (0:00:00.082) 0:04:02.786 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Friday 17 January 2025 04:44:10 -0500 (0:00:00.091) 0:04:02.878 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Friday 17 January 2025 04:44:10 -0500 (0:00:00.064) 0:04:02.942 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737107036.1777942, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1737107036.1777942, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 67, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1737107036.1777942, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "1216755601", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: 
TASK [Stat the file] ***********************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11
Friday 17 January 2025 04:44:10 -0500 (0:00:00.064) 0:04:02.942 ********
ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737107036.1777942, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1737107036.1777942, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 67, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1737107036.1777942, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "1216755601", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Assert file presence] ****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16
Friday 17 January 2025 04:44:11 -0500 (0:00:00.402) 0:04:03.345 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed
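The stat and assert tasks above confirm that the safe-mode refusal left /opt/test1/quux intact; nothing was reformatted during the failed attempt. The test then re-runs the role to actually strip the LUKS layer. Reconstructed from the Show storage_pools output below, that invocation looks roughly like the following sketch; re-declaring an encrypted volume with encryption: false is what requests the removal, and storage_safe_mode: false is an assumption here, since safe mode would otherwise fail exactly as above:

    # Reconstruction from the logged storage_pools value, not verbatim from
    # tests_luks.yml; the password is the test's throwaway value.
    - name: Remove the encryption layer
      include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_safe_mode: false      # assumption: permits the destructive change
        storage_pools:
          - name: foo
            type: partition
            disks: [sda]
            volumes:
              - name: test1
                type: partition
                size: 4g
                mount_point: /opt/test1
                encryption: false     # was true; turning it off strips the LUKS layer
                encryption_password: yabbadabbadoo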
TASK [Remove the encryption layer] *********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:263
Friday 17 January 2025 04:44:11 -0500 (0:00:00.059) 0:04:03.405 ********

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Friday 17 January 2025 04:44:11 -0500 (0:00:00.310) 0:04:03.716 ********
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node1

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Friday 17 January 2025 04:44:11 -0500 (0:00:00.149) 0:04:03.865 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Friday 17 January 2025 04:44:11 -0500 (0:00:00.080) 0:04:03.945 ********
skipping: [managed-node1] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node1] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node1] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" }
skipping: [managed-node1] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Friday 17 January 2025 04:44:12 -0500 (0:00:00.160) 0:04:04.106 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Friday 17 January 2025 04:44:12 -0500 (0:00:00.059) 0:04:04.165 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Friday 17 January 2025 04:44:12 -0500 (0:00:00.056) 0:04:04.222 ********
ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Friday 17 January 2025 04:44:12 -0500 (0:00:00.056) 0:04:04.279 ********
ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Friday 17 January 2025 04:44:12 -0500 (0:00:00.056) 0:04:04.335 ********
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node1

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
Friday 17 January 2025 04:44:12 -0500 (0:00:00.128) 0:04:04.463 ********
ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] }

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Friday 17 January 2025 04:44:16 -0500 (0:00:03.898) 0:04:08.362 ********
ok: [managed-node1] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] }

TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14
Friday 17 January 2025 04:44:16 -0500 (0:00:00.045) 0:04:08.407 ********
ok: [managed-node1] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
Friday 17 January 2025 04:44:16 -0500 (0:00:00.042) 0:04:08.450 ********
ok: [managed-node1] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }

TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path:
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Friday 17 January 2025 04:44:20 -0500 (0:00:04.146) 0:04:12.596 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 January 2025 04:44:20 -0500 (0:00:00.084) 0:04:12.681 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 January 2025 04:44:20 -0500 (0:00:00.044) 0:04:12.725 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 January 2025 04:44:20 -0500 (0:00:00.042) 0:04:12.767 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 January 2025 04:44:20 -0500 (0:00:00.035) 0:04:12.803 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Friday 17 January 2025 04:44:21 -0500 (0:00:00.729) 0:04:13.532 ******** ok: [managed-node1] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", 
"source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": 
"dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": 
"mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, 
"qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": 
"systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-cryptsetup@luks\\x2dfdb0e013\\x2d1d96\\x2d4dda\\x2d8e9e\\x2d5dbd0c55ead6.service": { "name": "systemd-cryptsetup@luks\\x2dfdb0e013\\x2d1d96\\x2d4dda\\x2d8e9e\\x2d5dbd0c55ead6.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Friday 17 January 2025 04:44:22 -0500 (0:00:01.130) 0:04:14.663 ******** ok: [managed-node1] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2dfdb0e013\\x2d1d96\\x2d4dda\\x2d8e9e\\x2d5dbd0c55ead6.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Friday 17 January 2025 04:44:22 -0500 (0:00:00.099) 0:04:14.762 ******** changed: [managed-node1] => (item=systemd-cryptsetup@luks\x2dfdb0e013\x2d1d96\x2d4dda\x2d8e9e\x2d5dbd0c55ead6.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dfdb0e013\\x2d1d96\\x2d4dda\\x2d8e9e\\x2d5dbd0c55ead6.service", "name": "systemd-cryptsetup@luks\\x2dfdb0e013\\x2d1d96\\x2d4dda\\x2d8e9e\\x2d5dbd0c55ead6.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "dev-sda1.device cryptsetup-pre.target systemd-readahead-replay.service systemd-journald.socket system-systemd\\x2dcryptsetup.slice systemd-readahead-collect.service", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "Delegate": "no", "Description": "Cryptography Setup for luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", 
"ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6 /dev/sda1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2dfdb0e013\\x2d1d96\\x2d4dda\\x2d8e9e\\x2d5dbd0c55ead6.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2dfdb0e013\\x2d1d96\\x2d4dda\\x2d8e9e\\x2d5dbd0c55ead6.service", "IgnoreOnIsolate": "yes", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2dfdb0e013\\x2d1d96\\x2d4dda\\x2d8e9e\\x2d5dbd0c55ead6.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RequiredBy": "cryptsetup.target dev-mapper-luks\\x2dfdb0e013\\x2d1d96\\x2d4dda\\x2d8e9e\\x2d5dbd0c55ead6.device", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "0", "TimeoutStopUSec": "0", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UMask": "0022", 
"UnitFilePreset": "disabled", "UnitFileState": "bad", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Friday 17 January 2025 04:44:23 -0500 (0:00:00.638) 0:04:15.401 ******** changed: [managed-node1] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=cf29b6a8-ce53-4e11-80b0-6e26a166e240", "state": "mounted" } ], "packages": [ "xfsprogs", "e2fsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=cf29b6a8-ce53-4e11-80b0-6e26a166e240", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Friday 17 January 2025 04:44:28 -0500 (0:00:04.675) 0:04:20.076 ******** skipping: [managed-node1] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Friday 17 January 2025 04:44:28 -0500 (0:00:00.058) 0:04:20.135 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737107018.921776, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "060f75d25d2f1dd737e4d9bd3deca227001d0b5d", "ctime": 1737107018.918776, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263644, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1737107018.918776, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1311, "uid": 0, "version": "18446744072031193646", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Friday 17 January 2025 04:44:28 -0500 (0:00:00.415) 0:04:20.550 ******** ok: [managed-node1] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Friday 17 January 2025 04:44:28 -0500 (0:00:00.421) 0:04:20.971 ******** changed: [managed-node1] => (item=systemd-cryptsetup@luks\x2dfdb0e013\x2d1d96\x2d4dda\x2d8e9e\x2d5dbd0c55ead6.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dfdb0e013\\x2d1d96\\x2d4dda\\x2d8e9e\\x2d5dbd0c55ead6.service", "name": "systemd-cryptsetup@luks\\x2dfdb0e013\\x2d1d96\\x2d4dda\\x2d8e9e\\x2d5dbd0c55ead6.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2dfdb0e013\\x2d1d96\\x2d4dda\\x2d8e9e\\x2d5dbd0c55ead6.service", "DevicePolicy": "auto", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2dfdb0e013\\x2d1d96\\x2d4dda\\x2d8e9e\\x2d5dbd0c55ead6.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", 
"InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "1048576", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "masked", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2dfdb0e013\\x2d1d96\\x2d4dda\\x2d8e9e\\x2d5dbd0c55ead6.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RequiredBy": "cryptsetup.target dev-mapper-luks\\x2dfdb0e013\\x2d1d96\\x2d4dda\\x2d8e9e\\x2d5dbd0c55ead6.device", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", "UnitFileState": "bad", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Friday 17 January 2025 04:44:29 -0500 (0:00:00.525) 0:04:21.496 ******** ok: [managed-node1] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": 
"/dev/mapper/luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=cf29b6a8-ce53-4e11-80b0-6e26a166e240", "state": "mounted" } ], "packages": [ "xfsprogs", "e2fsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=cf29b6a8-ce53-4e11-80b0-6e26a166e240", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Friday 17 January 2025 04:44:29 -0500 (0:00:00.083) 0:04:21.580 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=cf29b6a8-ce53-4e11-80b0-6e26a166e240", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, 
"mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Friday 17 January 2025 04:44:29 -0500 (0:00:00.074) 0:04:21.655 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Friday 17 January 2025 04:44:29 -0500 (0:00:00.061) 0:04:21.716 ******** changed: [managed-node1] => (item={u'src': u'/dev/mapper/luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Friday 17 January 2025 04:44:30 -0500 (0:00:00.382) 0:04:22.099 ******** ok: [managed-node1] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Friday 17 January 2025 04:44:30 -0500 (0:00:00.514) 0:04:22.613 ******** changed: [managed-node1] => (item={u'src': u'UUID=cf29b6a8-ce53-4e11-80b0-6e26a166e240', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=cf29b6a8-ce53-4e11-80b0-6e26a166e240", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=cf29b6a8-ce53-4e11-80b0-6e26a166e240" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Friday 17 January 2025 04:44:31 -0500 (0:00:00.469) 0:04:23.083 ******** skipping: [managed-node1] => (item={u'src': u'UUID=cf29b6a8-ce53-4e11-80b0-6e26a166e240', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": 
false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=cf29b6a8-ce53-4e11-80b0-6e26a166e240", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Friday 17 January 2025 04:44:31 -0500 (0:00:00.079) 0:04:23.163 ******** ok: [managed-node1] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Friday 17 January 2025 04:44:31 -0500 (0:00:00.534) 0:04:23.697 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737107024.3657818, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "c9e7a11ea8b462db169592a1cb21aa0329e9e5d3", "ctime": 1737107020.741778, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263661, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1737107020.740778, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 54, "uid": 0, "version": "18446744072031199547", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Friday 17 January 2025 04:44:32 -0500 (0:00:00.457) 0:04:24.155 ******** changed: [managed-node1] => (item={u'state': u'absent', u'password': u'-', u'name': u'luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6', u'backing_device': u'/dev/sda1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Friday 17 January 2025 04:44:32 -0500 (0:00:00.643) 0:04:24.799 ******** ok: [managed-node1] TASK [Verify role results] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:280 Friday 17 January 2025 04:44:33 -0500 (0:00:00.979) 0:04:25.778 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node1 TASK [Print out pool information] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Friday 17 January 2025 04:44:33 -0500 (0:00:00.156) 0:04:25.934 ******** ok: [managed-node1] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, 
"encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=cf29b6a8-ce53-4e11-80b0-6e26a166e240", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Friday 17 January 2025 04:44:33 -0500 (0:00:00.074) 0:04:26.009 ******** skipping: [managed-node1] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Friday 17 January 2025 04:44:34 -0500 (0:00:00.078) 0:04:26.087 ******** ok: [managed-node1] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "cf29b6a8-ce53-4e11-80b0-6e26a166e240" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Friday 17 January 2025 04:44:34 -0500 (0:00:00.546) 0:04:26.634 ******** ok: [managed-node1] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002906", "end": "2025-01-17 04:44:34.977138", "rc": 0, "start": "2025-01-17 04:44:34.974232" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Thu Jun 20 10:23:46 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 
UUID=cf29b6a8-ce53-4e11-80b0-6e26a166e240 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Friday 17 January 2025 04:44:35 -0500 (0:00:00.459) 0:04:27.093 ******** ok: [managed-node1] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002824", "end": "2025-01-17 04:44:35.368727", "failed_when_result": false, "rc": 0, "start": "2025-01-17 04:44:35.365903" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Friday 17 January 2025 04:44:35 -0500 (0:00:00.376) 0:04:27.469 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node1 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Friday 17 January 2025 04:44:35 -0500 (0:00:00.117) 0:04:27.587 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Friday 17 January 2025 04:44:35 -0500 (0:00:00.132) 0:04:27.720 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Friday 17 January 2025 04:44:35 -0500 (0:00:00.057) 0:04:27.777 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Friday 17 January 2025 04:44:35 -0500 (0:00:00.055) 0:04:27.833 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node1 TASK [Set test variables] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Friday 17 January 2025 04:44:35 -0500 (0:00:00.127) 0:04:27.961 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Friday 17 January 2025 04:44:36 -0500 (0:00:00.061) 0:04:28.023 ******** TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Friday 17 January 2025 04:44:36 -0500 (0:00:00.051) 0:04:28.074 
******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Friday 17 January 2025 04:44:36 -0500 (0:00:00.052) 0:04:28.126 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Friday 17 January 2025 04:44:36 -0500 (0:00:00.057) 0:04:28.183 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Friday 17 January 2025 04:44:36 -0500 (0:00:00.086) 0:04:28.269 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Friday 17 January 2025 04:44:36 -0500 (0:00:00.055) 0:04:28.325 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Friday 17 January 2025 04:44:36 -0500 (0:00:00.064) 0:04:28.390 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:51 Friday 17 January 2025 04:44:36 -0500 (0:00:00.058) 0:04:28.448 ******** TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:64 Friday 17 January 2025 04:44:36 -0500 (0:00:00.052) 0:04:28.500 ******** ok: [managed-node1] => { "changed": false, "rc": 0 } STDOUT: False STDERR: Shared connection to 10.31.46.65 closed. 
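The grow-to-fill probe above executes a short script on the managed node (the "Shared connection to 10.31.46.65 closed." on STDERR is only the SSH control connection shutting down) and prints False, so the "Verify that PVs fill the whole devices when they should" task that follows has nothing to iterate over. The surrounding PV count/type checks are skipped for the same underlying reason: this pool is of type "partition", not LVM, so every LVM-specific conditional evaluates false. A minimal sketch of how such a capability probe could be phrased as a single Ansible 2.9 task is shown below; the inline Python, the probed attribute name, and the register variable are illustrative assumptions, not the upstream test's actual script:

    # Hypothetical sketch of a blivet capability probe. The attribute name
    # 'grow_to_fill' and the register variable are assumptions for
    # illustration; the upstream test ships its own probe script.
    - name: Check that blivet supports PV grow to fill
      command: >-
        python -c "from blivet.formats import lvmpv;
        print(hasattr(lvmpv.LVMPhysicalVolume, 'grow_to_fill'))"
      register: storage_test_grow_support
      changed_when: false

A probe of this shape stays idempotent (changed_when: false) and lets later verification tasks gate on something like storage_test_grow_support.stdout == 'True' rather than hard-coding a blivet version; on this run the probe printed False, which matches the empty result set of the fill verification task below.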
TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:73 Friday 17 January 2025 04:44:36 -0500 (0:00:00.307) 0:04:28.808 ******** TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:83 Friday 17 January 2025 04:44:36 -0500 (0:00:00.053) 0:04:28.861 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node1 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Friday 17 January 2025 04:44:36 -0500 (0:00:00.121) 0:04:28.983 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Friday 17 January 2025 04:44:37 -0500 (0:00:00.058) 0:04:29.041 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Friday 17 January 2025 04:44:37 -0500 (0:00:00.055) 0:04:29.097 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Friday 17 January 2025 04:44:37 -0500 (0:00:00.058) 0:04:29.156 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Friday 17 January 2025 04:44:37 -0500 (0:00:00.061) 0:04:29.217 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Friday 17 January 2025 04:44:37 -0500 (0:00:00.074) 0:04:29.292 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Friday 17 January 2025 04:44:37 -0500 (0:00:00.076) 0:04:29.369 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Friday 17 January 2025 04:44:37 -0500 (0:00:00.055) 0:04:29.424 ******** skipping: [managed-node1] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Friday 17 January 2025 04:44:37 -0500 (0:00:00.059) 0:04:29.483 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Friday 17 January 2025 04:44:37 -0500 (0:00:00.057) 0:04:29.540 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Friday 17 January 2025 04:44:37 -0500 (0:00:00.059) 0:04:29.600 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:86 Friday 17 January 2025 04:44:37 -0500 (0:00:00.056) 0:04:29.656 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node1 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Friday 17 January 2025 04:44:37 -0500 (0:00:00.121) 0:04:29.778 ******** skipping: [managed-node1] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'/dev/sda1', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'_kernel_device': u'/dev/sda1', u'encryption': False, u'raid_level': None, u'name': u'test1', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_key_size': 0, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': u'luks1', u'raid_stripe_size': None, u'mount_passno': 0, u'_mount_id': u'UUID=cf29b6a8-ce53-4e11-80b0-6e26a166e240', u'mount_user': None, u'raid_spare_count': None, u'raid_disks': [], u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'cache_devices': [], u'deduplication': None, u'mount_group': None, u'thin_pool_size': None, u'disks': [u'sda'], u'cached': False, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'fs_overwrite_existing': True, u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=cf29b6a8-ce53-4e11-80b0-6e26a166e240", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], 
"cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:89 Friday 17 January 2025 04:44:37 -0500 (0:00:00.079) 0:04:29.857 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node1 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Friday 17 January 2025 04:44:37 -0500 (0:00:00.130) 0:04:29.987 ******** skipping: [managed-node1] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'/dev/sda1', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'_kernel_device': u'/dev/sda1', u'encryption': False, u'raid_level': None, u'name': u'test1', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_key_size': 0, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': u'luks1', u'raid_stripe_size': None, u'mount_passno': 0, u'_mount_id': u'UUID=cf29b6a8-ce53-4e11-80b0-6e26a166e240', u'mount_user': None, u'raid_spare_count': None, u'raid_disks': [], u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'cache_devices': [], u'deduplication': None, u'mount_group': None, u'thin_pool_size': None, u'disks': [u'sda'], u'cached': False, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'fs_overwrite_existing': True, u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=cf29b6a8-ce53-4e11-80b0-6e26a166e240", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", 
"fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:92 Friday 17 January 2025 04:44:38 -0500 (0:00:00.074) 0:04:30.062 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node1 TASK [Set test variables] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Friday 17 January 2025 04:44:38 -0500 (0:00:00.141) 0:04:30.203 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Friday 17 January 2025 04:44:38 -0500 (0:00:00.071) 0:04:30.274 ******** TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Friday 17 January 2025 04:44:38 -0500 (0:00:00.052) 0:04:30.327 ******** TASK [Clear test variables] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Friday 17 January 2025 04:44:38 -0500 (0:00:00.072) 0:04:30.399 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:95 Friday 17 January 2025 04:44:38 -0500 (0:00:00.058) 0:04:30.457 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node1 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Friday 17 January 2025 04:44:38 -0500 (0:00:00.144) 0:04:30.602 ******** skipping: [managed-node1] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'/dev/sda1', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'_kernel_device': u'/dev/sda1', u'encryption': False, u'raid_level': None, 
u'name': u'test1', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_key_size': 0, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': u'luks1', u'raid_stripe_size': None, u'mount_passno': 0, u'_mount_id': u'UUID=cf29b6a8-ce53-4e11-80b0-6e26a166e240', u'mount_user': None, u'raid_spare_count': None, u'raid_disks': [], u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'cache_devices': [], u'deduplication': None, u'mount_group': None, u'thin_pool_size': None, u'disks': [u'sda'], u'cached': False, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'fs_overwrite_existing': True, u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=cf29b6a8-ce53-4e11-80b0-6e26a166e240", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:98 Friday 17 January 2025 04:44:38 -0500 (0:00:00.099) 0:04:30.701 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node1 TASK [Run 'stratis report'] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Friday 17 January 2025 04:44:38 -0500 (0:00:00.211) 0:04:30.912 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:11 Friday 17 January 2025 04:44:38 -0500 (0:00:00.055) 0:04:30.967 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Friday 17 January 2025 04:44:38 -0500 (0:00:00.048) 0:04:31.015 ******** skipping: [managed-node1] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:25 Friday 17 January 2025 04:44:39 -0500 (0:00:00.038) 0:04:31.053 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:34 Friday 17 January 2025 04:44:39 -0500 (0:00:00.037) 0:04:31.091 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:44 Friday 17 January 2025 04:44:39 -0500 (0:00:00.037) 0:04:31.129 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:101 Friday 17 January 2025 04:44:39 -0500 (0:00:00.047) 0:04:31.176 ******** ok: [managed-node1] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Friday 17 January 2025 04:44:39 -0500 (0:00:00.055) 0:04:31.231 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node1 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Friday 17 January 2025 04:44:39 -0500 (0:00:00.097) 0:04:31.329 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Friday 17 January 2025 04:44:39 -0500 (0:00:00.066) 0:04:31.395 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node1 included: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node1 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Friday 17 January 2025 04:44:39 -0500 (0:00:00.295) 0:04:31.691 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_device_path": "/dev/sda1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Friday 17 January 2025 04:44:39 -0500 (0:00:00.062) 0:04:31.754 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Friday 17 January 2025 04:44:39 -0500 (0:00:00.074) 0:04:31.829 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Friday 17 January 2025 04:44:39 -0500 (0:00:00.057) 0:04:31.886 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Friday 17 January 2025 04:44:39 -0500 (0:00:00.066) 0:04:31.952 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Friday 17 January 2025 04:44:39 -0500 (0:00:00.056) 0:04:32.009 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Friday 17 January 2025 04:44:40 -0500 (0:00:00.046) 0:04:32.055 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Friday 17 January 2025 04:44:40 -0500 (0:00:00.050) 0:04:32.106 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Friday 17 January 2025 04:44:40 -0500 (0:00:00.052) 0:04:32.158 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Friday 17 January 2025 04:44:40 -0500 (0:00:00.041) 0:04:32.200 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Friday 17 January 2025 04:44:40 -0500 (0:00:00.037) 0:04:32.238 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Friday 17 January 2025 04:44:40 -0500 (0:00:00.039) 0:04:32.278 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=cf29b6a8-ce53-4e11-80b0-6e26a166e240 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Friday 17 January 2025 04:44:40 -0500 (0:00:00.063) 0:04:32.342 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Friday 17 January 2025 04:44:40 -0500 (0:00:00.044) 0:04:32.387 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Friday 17 January 2025 04:44:40 -0500 (0:00:00.066) 0:04:32.453 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Friday 17 January 2025 04:44:40 -0500 (0:00:00.052) 0:04:32.506 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Friday 17 
January 2025 04:44:40 -0500 (0:00:00.063) 0:04:32.569 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Friday 17 January 2025 04:44:40 -0500 (0:00:00.046) 0:04:32.616 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Friday 17 January 2025 04:44:40 -0500 (0:00:00.065) 0:04:32.682 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Friday 17 January 2025 04:44:40 -0500 (0:00:00.061) 0:04:32.743 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737107067.9158278, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1737107067.9158278, "dev": 5, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 386862, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1737107067.9158278, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Friday 17 January 2025 04:44:41 -0500 (0:00:00.363) 0:04:33.107 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Friday 17 January 2025 04:44:41 -0500 (0:00:00.053) 0:04:33.161 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Friday 17 January 2025 04:44:41 -0500 (0:00:00.038) 0:04:33.199 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Friday 17 January 2025 04:44:41 -0500 (0:00:00.045) 0:04:33.245 ******** ok: 
[managed-node1] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Friday 17 January 2025 04:44:41 -0500 (0:00:00.041) 0:04:33.287 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Friday 17 January 2025 04:44:41 -0500 (0:00:00.038) 0:04:33.325 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Friday 17 January 2025 04:44:41 -0500 (0:00:00.047) 0:04:33.373 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Friday 17 January 2025 04:44:41 -0500 (0:00:00.049) 0:04:33.422 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Friday 17 January 2025 04:44:42 -0500 (0:00:00.672) 0:04:34.094 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Friday 17 January 2025 04:44:42 -0500 (0:00:00.037) 0:04:34.132 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Friday 17 January 2025 04:44:42 -0500 (0:00:00.038) 0:04:34.170 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Friday 17 January 2025 04:44:42 -0500 (0:00:00.051) 0:04:34.222 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Friday 17 January 2025 04:44:42 -0500 (0:00:00.037) 0:04:34.260 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] 
****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Friday 17 January 2025 04:44:42 -0500 (0:00:00.038) 0:04:34.298 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Friday 17 January 2025 04:44:42 -0500 (0:00:00.040) 0:04:34.338 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Friday 17 January 2025 04:44:42 -0500 (0:00:00.045) 0:04:34.384 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Friday 17 January 2025 04:44:42 -0500 (0:00:00.063) 0:04:34.448 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Friday 17 January 2025 04:44:42 -0500 (0:00:00.072) 0:04:34.520 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Friday 17 January 2025 04:44:42 -0500 (0:00:00.084) 0:04:34.605 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Friday 17 January 2025 04:44:42 -0500 (0:00:00.058) 0:04:34.663 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Friday 17 January 2025 04:44:42 -0500 (0:00:00.062) 0:04:34.726 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Friday 17 January 2025 04:44:42 -0500 (0:00:00.055) 0:04:34.782 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": 
null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Friday 17 January 2025 04:44:42 -0500 (0:00:00.058) 0:04:34.841 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Friday 17 January 2025 04:44:42 -0500 (0:00:00.044) 0:04:34.885 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Friday 17 January 2025 04:44:42 -0500 (0:00:00.049) 0:04:34.935 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Friday 17 January 2025 04:44:42 -0500 (0:00:00.046) 0:04:34.981 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Friday 17 January 2025 04:44:43 -0500 (0:00:00.043) 0:04:35.025 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Friday 17 January 2025 04:44:43 -0500 (0:00:00.037) 0:04:35.063 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Friday 17 January 2025 04:44:43 -0500 (0:00:00.038) 0:04:35.101 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Friday 17 January 2025 04:44:43 -0500 (0:00:00.047) 0:04:35.149 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Friday 17 January 2025 04:44:43 -0500 (0:00:00.055) 0:04:35.204 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Friday 17 January 2025 04:44:43 -0500 (0:00:00.049) 0:04:35.253 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Friday 17 January 2025 04:44:43 -0500 (0:00:00.049) 0:04:35.303 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Friday 17 January 2025 04:44:43 -0500 (0:00:00.112) 0:04:35.416 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Friday 17 January 2025 04:44:43 -0500 (0:00:00.039) 0:04:35.455 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Friday 17 January 2025 04:44:43 -0500 (0:00:00.044) 0:04:35.500 ******** ok: [managed-node1] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Friday 17 January 2025 04:44:43 -0500 (0:00:00.067) 0:04:35.567 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Friday 17 January 2025 04:44:43 -0500 (0:00:00.066) 0:04:35.634 ******** skipping: [managed-node1] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Friday 17 January 2025 04:44:43 -0500 (0:00:00.051) 0:04:35.685 ******** skipping: [managed-node1] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Friday 17 January 2025 04:44:43 -0500 (0:00:00.058) 0:04:35.744 ******** skipping: [managed-node1] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Friday 17 January 2025 04:44:43 -0500 (0:00:00.041) 0:04:35.785 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] 
********************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Friday 17 January 2025 04:44:43 -0500 (0:00:00.039) 0:04:35.825 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Friday 17 January 2025 04:44:43 -0500 (0:00:00.038) 0:04:35.863 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Friday 17 January 2025 04:44:43 -0500 (0:00:00.048) 0:04:35.912 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Friday 17 January 2025 04:44:43 -0500 (0:00:00.057) 0:04:35.969 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Friday 17 January 2025 04:44:44 -0500 (0:00:00.071) 0:04:36.041 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Friday 17 January 2025 04:44:44 -0500 (0:00:00.057) 0:04:36.099 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Friday 17 January 2025 04:44:44 -0500 (0:00:00.060) 0:04:36.160 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Friday 17 January 2025 04:44:44 -0500 (0:00:00.058) 0:04:36.219 ******** skipping: [managed-node1] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Friday 17 January 2025 04:44:44 -0500 (0:00:00.058) 0:04:36.278 ******** skipping: [managed-node1] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Friday 17 January 2025 04:44:44 -0500 (0:00:00.058) 0:04:36.336 ******** skipping: [managed-node1] => {} TASK [Establish base value for expected thin pool 
size] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Friday 17 January 2025 04:44:44 -0500 (0:00:00.061) 0:04:36.398 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Friday 17 January 2025 04:44:44 -0500 (0:00:00.057) 0:04:36.455 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Friday 17 January 2025 04:44:44 -0500 (0:00:00.056) 0:04:36.512 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Friday 17 January 2025 04:44:44 -0500 (0:00:00.057) 0:04:36.570 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Friday 17 January 2025 04:44:44 -0500 (0:00:00.055) 0:04:36.625 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Friday 17 January 2025 04:44:44 -0500 (0:00:00.068) 0:04:36.694 ******** ok: [managed-node1] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Friday 17 January 2025 04:44:44 -0500 (0:00:00.069) 0:04:36.763 ******** ok: [managed-node1] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Friday 17 January 2025 04:44:44 -0500 (0:00:00.058) 0:04:36.821 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Friday 17 January 2025 04:44:44 -0500 (0:00:00.060) 0:04:36.882 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Friday 17 January 2025 04:44:44 -0500 (0:00:00.058) 0:04:36.940 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Friday 17 January 2025 04:44:44 -0500 (0:00:00.056) 0:04:36.997 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Friday 17 January 2025 04:44:45 -0500 (0:00:00.054) 0:04:37.051 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Friday 17 January 2025 04:44:45 -0500 (0:00:00.065) 0:04:37.116 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Friday 17 January 2025 04:44:45 -0500 (0:00:00.061) 0:04:37.177 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Friday 17 January 2025 04:44:45 -0500 (0:00:00.056) 0:04:37.234 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Friday 17 January 2025 04:44:45 -0500 (0:00:00.057) 0:04:37.291 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Friday 17 January 2025 04:44:45 -0500 (0:00:00.057) 0:04:37.349 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Friday 17 January 2025 04:44:45 -0500 (0:00:00.050) 0:04:37.400 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Friday 17 January 2025 04:44:45 -0500 (0:00:00.059) 0:04:37.459 ******** changed: [managed-node1] => { "changed": true, 
"dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:286 Friday 17 January 2025 04:44:45 -0500 (0:00:00.388) 0:04:37.848 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node1 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Friday 17 January 2025 04:44:45 -0500 (0:00:00.134) 0:04:37.982 ******** ok: [managed-node1] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Friday 17 January 2025 04:44:46 -0500 (0:00:00.068) 0:04:38.051 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 January 2025 04:44:46 -0500 (0:00:00.096) 0:04:38.147 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 January 2025 04:44:46 -0500 (0:00:00.089) 0:04:38.236 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 January 2025 04:44:46 -0500 (0:00:00.066) 0:04:38.303 ******** skipping: [managed-node1] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node1] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node1] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node1] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 January 2025 04:44:46 -0500 (0:00:00.131) 0:04:38.434 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 January 2025 04:44:46 -0500 (0:00:00.056) 0:04:38.491 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Friday 17 January 2025 04:44:46 -0500 (0:00:00.060) 0:04:38.552 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 January 2025 04:44:46 -0500 (0:00:00.054) 0:04:38.606 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 January 2025 04:44:46 -0500 (0:00:00.049) 0:04:38.656 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Friday 17 January 2025 04:44:46 -0500 (0:00:00.112) 0:04:38.768 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 January 2025 04:44:50 -0500 (0:00:03.861) 0:04:42.630 ******** ok: [managed-node1] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Friday 17 
January 2025 04:44:50 -0500 (0:00:00.048) 0:04:42.678 ******** ok: [managed-node1] => { "storage_volumes": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Friday 17 January 2025 04:44:50 -0500 (0:00:00.109) 0:04:42.788 ******** ok: [managed-node1] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Friday 17 January 2025 04:44:54 -0500 (0:00:04.121) 0:04:46.909 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 January 2025 04:44:54 -0500 (0:00:00.092) 0:04:47.001 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 January 2025 04:44:55 -0500 (0:00:00.043) 0:04:47.045 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 January 2025 04:44:55 -0500 (0:00:00.048) 0:04:47.094 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 January 2025 04:44:55 -0500 (0:00:00.034) 0:04:47.129 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Friday 17 January 2025 04:44:55 -0500 (0:00:00.667) 0:04:47.797 ******** ok: [managed-node1] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", 
"source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", 
"state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { 
"name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-cryptsetup@luks\\x2dfdb0e013\\x2d1d96\\x2d4dda\\x2d8e9e\\x2d5dbd0c55ead6.service": { "name": "systemd-cryptsetup@luks\\x2dfdb0e013\\x2d1d96\\x2d4dda\\x2d8e9e\\x2d5dbd0c55ead6.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" 
}, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Friday 17 January 2025 04:44:56 -0500 (0:00:00.994) 0:04:48.791 ******** ok: [managed-node1] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2dfdb0e013\\x2d1d96\\x2d4dda\\x2d8e9e\\x2d5dbd0c55ead6.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Friday 17 January 2025 04:44:56 -0500 (0:00:00.059) 0:04:48.850 ******** changed: [managed-node1] => (item=systemd-cryptsetup@luks\x2dfdb0e013\x2d1d96\x2d4dda\x2d8e9e\x2d5dbd0c55ead6.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dfdb0e013\\x2d1d96\\x2d4dda\\x2d8e9e\\x2d5dbd0c55ead6.service", "name": "systemd-cryptsetup@luks\\x2dfdb0e013\\x2d1d96\\x2d4dda\\x2d8e9e\\x2d5dbd0c55ead6.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-readahead-collect.service systemd-readahead-replay.service cryptsetup-pre.target system-systemd\\x2dcryptsetup.slice dev-sda1.device systemd-journald.socket", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", 
"CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "Delegate": "no", "Description": "Cryptography Setup for luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6 /dev/sda1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-fdb0e013-1d96-4dda-8e9e-5dbd0c55ead6 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2dfdb0e013\\x2d1d96\\x2d4dda\\x2d8e9e\\x2d5dbd0c55ead6.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2dfdb0e013\\x2d1d96\\x2d4dda\\x2d8e9e\\x2d5dbd0c55ead6.service", "IgnoreOnIsolate": "yes", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2dfdb0e013\\x2d1d96\\x2d4dda\\x2d8e9e\\x2d5dbd0c55ead6.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", 
"StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "0", "TimeoutStopUSec": "0", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "bad", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Friday 17 January 2025 04:44:57 -0500 (0:00:00.523) 0:04:49.374 ******** fatal: [managed-node1]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:109 Friday 17 January 2025 04:45:01 -0500 (0:00:04.027) 0:04:53.402 ******** fatal: [managed-node1]: FAILED! => { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'diskvolume_mkfs_option_map': {u'ext4': u'-F', u'ext3': u'-F', u'ext2': u'-F'}, u'safe_mode': True, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'encryption_tang_url': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_tang_thumbprint': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'encryption': True, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'mount_passno': 0, u'mount_user': None, u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'mount_group': None, u'fs_overwrite_existing': True, u'disks': [u'sda'], u'cached': False, u'thin_pool_size': None, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'shared': False, u'encryption_clevis_pin': None, u'type': u'partition', u'encryption_cipher': None, u'raid_chunk_size': None}], u'volumes': [], u'pool_defaults': {u'raid_metadata_version': None, 
u'encryption_cipher': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_password': None, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'shared': False, u'type': u'lvm', u'encryption_key_size': None, u'raid_chunk_size': None}, u'volume_defaults': {u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 0, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'cache_size': 0, u'raid_spare_count': None, u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption"} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Friday 17 January 2025 04:45:01 -0500 (0:00:00.080) 0:04:53.482 ******** changed: [managed-node1] => (item=systemd-cryptsetup@luks\x2dfdb0e013\x2d1d96\x2d4dda\x2d8e9e\x2d5dbd0c55ead6.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dfdb0e013\\x2d1d96\\x2d4dda\\x2d8e9e\\x2d5dbd0c55ead6.service", "name": "systemd-cryptsetup@luks\\x2dfdb0e013\\x2d1d96\\x2d4dda\\x2d8e9e\\x2d5dbd0c55ead6.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2dfdb0e013\\x2d1d96\\x2d4dda\\x2d8e9e\\x2d5dbd0c55ead6.service", "DevicePolicy": "auto", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2dfdb0e013\\x2d1d96\\x2d4dda\\x2d8e9e\\x2d5dbd0c55ead6.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", 
"LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "1048576", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "masked", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2dfdb0e013\\x2d1d96\\x2d4dda\\x2d8e9e\\x2d5dbd0c55ead6.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", "UnitFileState": "bad", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Friday 17 January 2025 04:45:02 -0500 (0:00:00.553) 0:04:54.035 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Friday 17 January 2025 04:45:02 -0500 (0:00:00.065) 0:04:54.101 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Friday 17 January 2025 04:45:02 -0500 (0:00:00.073) 0:04:54.174 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Friday 17 January 2025 04:45:02 -0500 (0:00:00.058) 0:04:54.232 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 
1737107085.7608464, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1737107085.7608464, "dev": 2049, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 67, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1737107085.7608464, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "18446744072654079018", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Friday 17 January 2025 04:45:02 -0500 (0:00:00.429) 0:04:54.661 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Create a key file] ******************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:312 Friday 17 January 2025 04:45:02 -0500 (0:00:00.066) 0:04:54.728 ******** ok: [managed-node1] => { "changed": false, "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/tmp/storage_testaZEGMVlukskey", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Write the key into the key file] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:319 Friday 17 January 2025 04:45:03 -0500 (0:00:00.632) 0:04:55.360 ******** ok: [managed-node1] => { "changed": false, "checksum": "7a4dff3752e2baf5617c57eaac048e2b95e8af91", "dest": "/tmp/storage_testaZEGMVlukskey", "gid": 0, "group": "root", "md5sum": "4ac07b967150835c00d0865161e48744", "mode": "0600", "owner": "root", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 32, "src": "/root/.ansible/tmp/ansible-tmp-1737107103.41-26670-235988288209941/source", "state": "file", "uid": 0 } TASK [Add encryption to the volume] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:326 Friday 17 January 2025 04:45:04 -0500 (0:00:00.984) 0:04:56.344 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 January 2025 04:45:04 -0500 (0:00:00.113) 0:04:56.458 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 January 2025 04:45:04 -0500 (0:00:00.086) 0:04:56.545 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 January 2025 04:45:04 -0500 (0:00:00.067) 0:04:56.613 ******** skipping: [managed-node1] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node1] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node1] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node1] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 January 2025 04:45:04 -0500 (0:00:00.130) 0:04:56.743 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 January 2025 04:45:04 -0500 (0:00:00.058) 0:04:56.801 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Friday 17 January 2025 04:45:04 -0500 (0:00:00.048) 0:04:56.849 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 January 2025 04:45:04 -0500 (0:00:00.045) 0:04:56.895 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 January 2025 04:45:04 -0500 (0:00:00.047) 0:04:56.943 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Friday 17 January 2025 04:45:05 -0500 (0:00:00.187) 0:04:57.131 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch 
providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 January 2025 04:45:06 -0500 (0:00:01.157) 0:04:58.288 ******** ok: [managed-node1] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_key": "/tmp/storage_testaZEGMVlukskey", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Friday 17 January 2025 04:45:06 -0500 (0:00:00.047) 0:04:58.336 ******** ok: [managed-node1] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Friday 17 January 2025 04:45:06 -0500 (0:00:00.041) 0:04:58.378 ******** ok: [managed-node1] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Friday 17 January 2025 04:45:10 -0500 (0:00:03.924) 0:05:02.302 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 January 2025 04:45:10 -0500 (0:00:00.098) 0:05:02.401 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 January 2025 04:45:10 -0500 (0:00:00.050) 0:05:02.452 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 January 2025 04:45:10 -0500 (0:00:00.054) 0:05:02.506 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 
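[Editorial note] The failed run above is the expected outcome of the role's safe mode: the requested volume adds encryption to test1, which would require wiping the existing xfs signature on /dev/sda1, and with u'safe_mode': True (visible in the module args of the "Failed message" dump above) blivet refuses to destroy existing formatting. After asserting the failure and confirming that /opt/test1/quux survived, the test creates a key file, writes a 32-byte key into it, and re-runs the role with the storage_pools value shown above. The task below is a minimal sketch of what such a re-invocation looks like, not the literal contents of tests_luks.yml; in particular, storage_safe_mode: false is an assumption, inferred from the fact that the run that follows does reformat sda1.

    - name: Add encryption to the volume
      include_role:
        name: fedora.linux_system_roles.storage
      vars:
        # assumption: safe mode must be disabled for this run, since the
        # earlier invocation failed with safe_mode true while adding encryption
        storage_safe_mode: false
        storage_pools:
          - name: foo
            type: partition
            disks:
              - sda
            volumes:
              - name: test1
                type: partition
                size: 4g
                mount_point: /opt/test1
                encryption: true
                encryption_key: /tmp/storage_testaZEGMVlukskey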
Friday 17 January 2025 04:45:10 -0500 (0:00:00.046) 0:05:02.552 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] }
TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Friday 17 January 2025 04:45:11 -0500 (0:00:00.758) 0:05:03.310 ******** ok: [managed-node1] => { "ansible_facts": { "services": { ...service facts elided; identical to the dump returned by the earlier "Get service facts" task above... } }, "changed": false }
TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Friday 17 January 2025 04:45:12 -0500 (0:00:00.981) 0:05:04.291 ******** ok: [managed-node1] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }
TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path:
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Friday 17 January 2025 04:45:12 -0500 (0:00:00.077) 0:05:04.369 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Friday 17 January 2025 04:45:12 -0500 (0:00:00.047) 0:05:04.417 ******** changed: [managed-node1] => { "actions": [ { "action": "destroy format", "device": "/dev/sda1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-fde34b0b-0d82-46ce-b76a-4930d8f4912d", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-fde34b0b-0d82-46ce-b76a-4930d8f4912d", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-fde34b0b-0d82-46ce-b76a-4930d8f4912d", "password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-fde34b0b-0d82-46ce-b76a-4930d8f4912d" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=cf29b6a8-ce53-4e11-80b0-6e26a166e240", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-fde34b0b-0d82-46ce-b76a-4930d8f4912d", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-fde34b0b-0d82-46ce-b76a-4930d8f4912d", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-fde34b0b-0d82-46ce-b76a-4930d8f4912d", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Friday 17 January 2025 04:45:23 -0500 (0:00:10.856) 0:05:15.274 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Friday 17 January 2025 04:45:23 -0500 (0:00:00.037) 0:05:15.311 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737107070.953831, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "81c07b997dfe91ee8e474d4cdff8241117471f5b", "ctime": 1737107070.950831, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263644, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1737107070.950831, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1299, "uid": 0, "version": "18446744072031193646", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }
TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Friday 17 January 2025 04:45:23 -0500 (0:00:00.377) 0:05:15.688 ******** ok: [managed-node1] => { "backup": "", "changed": false }
TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Friday 17 January 2025 04:45:24 -0500 (0:00:00.359) 0:05:16.048 ********
TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Friday 17 January 2025 04:45:24 -0500 (0:00:00.036) 0:05:16.084 ******** ok: [managed-node1] => { "blivet_output": { ...elided; identical to the module result shown above, plus "failed": false... } }
TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Friday 17 January 2025 04:45:24 -0500 (0:00:00.062) 0:05:16.147 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [ ...elided; the "pools" subtree of blivet_output above... ] }, "changed": false }
TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Friday 17 January 2025 04:45:24 -0500 (0:00:00.070) 0:05:16.217 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }
TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Friday 17 January 2025 04:45:24 -0500 (0:00:00.050) 0:05:16.268 ******** changed: [managed-node1] => (item={u'src': u'UUID=cf29b6a8-ce53-4e11-80b0-6e26a166e240', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=cf29b6a8-ce53-4e11-80b0-6e26a166e240", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=cf29b6a8-ce53-4e11-80b0-6e26a166e240" }
TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Friday 17 January 2025 04:45:24 -0500 (0:00:00.387) 0:05:16.655 ******** ok: [managed-node1] => { "changed": false, "name": null, "status": {} }
TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Friday 17 January 2025 04:45:25 -0500 (0:00:00.502) 0:05:17.158 ******** changed: [managed-node1] => (item={u'src': u'/dev/mapper/luks-fde34b0b-0d82-46ce-b76a-4930d8f4912d', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-fde34b0b-0d82-46ce-b76a-4930d8f4912d", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-fde34b0b-0d82-46ce-b76a-4930d8f4912d" }
TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Friday 17 January 2025 04:45:25 -0500 (0:00:00.392) 0:05:17.550 ******** skipping: [managed-node1] => (item={u'src': u'/dev/mapper/luks-fde34b0b-0d82-46ce-b76a-4930d8f4912d', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-fde34b0b-0d82-46ce-b76a-4930d8f4912d", "state": "mounted" }, "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Friday 17 January 2025 04:45:25 -0500 (0:00:00.060) 0:05:17.611 ******** ok: [managed-node1] => { "changed": false, "name": null, "status": {} }
TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Friday 17 January 2025 04:45:26 -0500 (0:00:00.479) 0:05:18.091 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737107075.3668356, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1737107072.6128325, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263659, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1737107072.6118326, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "18446744072031199732", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }
TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Friday 17 January 2025 04:45:26 -0500 (0:00:00.379) 0:05:18.470 ******** changed: [managed-node1] => (item={u'state': u'present', u'password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'name': u'luks-fde34b0b-0d82-46ce-b76a-4930d8f4912d', u'backing_device': u'/dev/sda1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-fde34b0b-0d82-46ce-b76a-4930d8f4912d", "password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "state": "present" } } MSG: line added
TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Friday 17 January 2025 04:45:26 -0500 (0:00:00.406) 0:05:18.877 ******** ok: [managed-node1]
TASK [Verify role results] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:343 Friday 17 January 2025 04:45:27 -0500 (0:00:00.947) 0:05:19.825 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node1
TASK [Print out pool information] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Friday 17 January 2025 04:45:27 -0500 (0:00:00.155) 0:05:19.981 ******** ok: [managed-node1] => { "_storage_pools_list": [ ...elided; identical to the _storage_pools_list fact set above... ] }
TASK [Print out volume information] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Friday 17 January 2025 04:45:28 -0500 (0:00:00.127) 0:05:20.108 ******** skipping: [managed-node1] => {} TASK [Collect info about the volumes.]
***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Friday 17 January 2025 04:45:28 -0500 (0:00:00.113) 0:05:20.222 ******** ok: [managed-node1] => { "changed": false, "info": { "/dev/mapper/luks-fde34b0b-0d82-46ce-b76a-4930d8f4912d": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-fde34b0b-0d82-46ce-b76a-4930d8f4912d", "size": "10G", "type": "crypt", "uuid": "25cde8b5-3eb4-4e48-b3b2-5bef0ad182dd" }, "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "fde34b0b-0d82-46ce-b76a-4930d8f4912d" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Friday 17 January 2025 04:45:29 -0500 (0:00:01.627) 0:05:21.849 ******** ok: [managed-node1] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002920", "end": "2025-01-17 04:45:30.133632", "rc": 0, "start": "2025-01-17 04:45:30.130712" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Thu Jun 20 10:23:46 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-fde34b0b-0d82-46ce-b76a-4930d8f4912d /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Friday 17 January 2025 04:45:30 -0500 (0:00:00.391) 0:05:22.241 ******** ok: [managed-node1] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003014", "end": "2025-01-17 04:45:30.535211", "failed_when_result": false, "rc": 0, "start": "2025-01-17 04:45:30.532197" } STDOUT: luks-fde34b0b-0d82-46ce-b76a-4930d8f4912d /dev/sda1 VALUE_SPECIFIED_IN_NO_LOG_PARAMETER TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Friday 17 January 2025 04:45:30 -0500 (0:00:00.376) 0:05:22.617 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node1 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Friday 17 January 2025 04:45:30 -0500 (0:00:00.083) 0:05:22.700 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Friday 17 January 2025 04:45:30 -0500 (0:00:00.039) 0:05:22.740 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Friday 17 January 2025 04:45:30 -0500 (0:00:00.040) 0:05:22.780 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Friday 17 January 2025 04:45:30 -0500 (0:00:00.048) 0:05:22.828 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node1 TASK [Set test variables] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Friday 17 January 2025 04:45:30 -0500 (0:00:00.125) 0:05:22.953 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Friday 17 January 2025 04:45:30 -0500 (0:00:00.060) 0:05:23.014 ******** TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Friday 17 January 2025 04:45:31 -0500 (0:00:00.054) 0:05:23.069 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Friday 17 January 2025 04:45:31 -0500 (0:00:00.059) 0:05:23.129 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Friday 17 January 2025 04:45:31 -0500 (0:00:00.061) 0:05:23.191 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Friday 17 January 2025 04:45:31 -0500 (0:00:00.057) 0:05:23.248 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Friday 17 January 2025 04:45:31 -0500 (0:00:00.056) 0:05:23.304 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Friday 17 January 2025 04:45:31 -0500 (0:00:00.059) 0:05:23.364 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:51 Friday 17 January 2025 04:45:31 -0500 (0:00:00.065) 0:05:23.429 ******** TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:64 Friday 17 January 2025 04:45:31 -0500 (0:00:00.056) 0:05:23.486 ******** ok: [managed-node1] => { "changed": false, "rc": 0 } STDOUT: False STDERR: Shared connection to 10.31.46.65 closed. 
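Note on the result above: the "Check that blivet supports PV grow to fill" task is a capability probe run on the managed node, reporting whether the installed python-blivet understands growing an LVM physical volume to fill its backing device. The STDOUT of "False" (expected for the older blivet shipped on this el7 host) is why the fill-the-whole-device verification that follows produces no per-device results. A minimal sketch of such a probe, assuming python-blivet is importable on the node; the attribute name grow_to_fill is illustrative here, not confirmed from this log:

    # Sketch: probe whether this blivet build exposes LVM PV grow-to-fill.
    # Assumption: when supported, the feature appears as a "grow_to_fill"
    # attribute on the LVM PV format class; older builds print False.
    import blivet.formats.lvmpv

    print(hasattr(blivet.formats.lvmpv.LVMPhysicalVolume, "grow_to_fill"))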
TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:73 Friday 17 January 2025 04:45:31 -0500 (0:00:00.310) 0:05:23.796 ******** TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:83 Friday 17 January 2025 04:45:31 -0500 (0:00:00.055) 0:05:23.851 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node1 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Friday 17 January 2025 04:45:31 -0500 (0:00:00.118) 0:05:23.970 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Friday 17 January 2025 04:45:32 -0500 (0:00:00.061) 0:05:24.032 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Friday 17 January 2025 04:45:32 -0500 (0:00:00.053) 0:05:24.085 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Friday 17 January 2025 04:45:32 -0500 (0:00:00.059) 0:05:24.144 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Friday 17 January 2025 04:45:32 -0500 (0:00:00.056) 0:05:24.200 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Friday 17 January 2025 04:45:32 -0500 (0:00:00.056) 0:05:24.257 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Friday 17 January 2025 04:45:32 -0500 (0:00:00.057) 0:05:24.315 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Friday 17 January 2025 04:45:32 -0500 (0:00:00.056) 0:05:24.371 ******** skipping: [managed-node1] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Friday 17 January 2025 04:45:32 -0500 (0:00:00.055) 0:05:24.426 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Friday 17 January 2025 04:45:32 -0500 (0:00:00.060) 0:05:24.487 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Friday 17 January 2025 04:45:32 -0500 (0:00:00.057) 0:05:24.544 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:86 Friday 17 January 2025 04:45:32 -0500 (0:00:00.074) 0:05:24.618 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node1 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Friday 17 January 2025 04:45:32 -0500 (0:00:00.131) 0:05:24.750 ******** skipping: [managed-node1] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'/dev/mapper/luks-fde34b0b-0d82-46ce-b76a-4930d8f4912d', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'_kernel_device': u'/dev/dm-0', u'encryption': True, u'raid_level': None, u'name': u'test1', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'mount_passno': 0, u'_mount_id': u'/dev/mapper/luks-fde34b0b-0d82-46ce-b76a-4930d8f4912d', u'mount_user': None, u'raid_spare_count': None, u'raid_disks': [], u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'cache_devices': [], u'deduplication': None, u'mount_group': None, u'thin_pool_size': None, u'disks': [u'sda'], u'cached': False, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'fs_overwrite_existing': True, u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/mapper/luks-fde34b0b-0d82-46ce-b76a-4930d8f4912d", "_kernel_device": "/dev/dm-0", "_mount_id": 
"/dev/mapper/luks-fde34b0b-0d82-46ce-b76a-4930d8f4912d", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:89 Friday 17 January 2025 04:45:32 -0500 (0:00:00.078) 0:05:24.829 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node1 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Friday 17 January 2025 04:45:32 -0500 (0:00:00.123) 0:05:24.952 ******** skipping: [managed-node1] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'/dev/mapper/luks-fde34b0b-0d82-46ce-b76a-4930d8f4912d', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'_kernel_device': u'/dev/dm-0', u'encryption': True, u'raid_level': None, u'name': u'test1', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'mount_passno': 0, u'_mount_id': u'/dev/mapper/luks-fde34b0b-0d82-46ce-b76a-4930d8f4912d', u'mount_user': None, u'raid_spare_count': None, u'raid_disks': [], u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'cache_devices': [], u'deduplication': None, u'mount_group': None, u'thin_pool_size': None, u'disks': [u'sda'], u'cached': False, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'fs_overwrite_existing': True, u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/mapper/luks-fde34b0b-0d82-46ce-b76a-4930d8f4912d", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-fde34b0b-0d82-46ce-b76a-4930d8f4912d", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" 
], "encryption": true, "encryption_cipher": null, "encryption_key": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:92 Friday 17 January 2025 04:45:33 -0500 (0:00:00.075) 0:05:25.028 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node1 TASK [Set test variables] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Friday 17 January 2025 04:45:33 -0500 (0:00:00.140) 0:05:25.168 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Friday 17 January 2025 04:45:33 -0500 (0:00:00.064) 0:05:25.232 ******** TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Friday 17 January 2025 04:45:33 -0500 (0:00:00.057) 0:05:25.290 ******** TASK [Clear test variables] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Friday 17 January 2025 04:45:33 -0500 (0:00:00.051) 0:05:25.342 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:95 Friday 17 January 2025 04:45:33 -0500 (0:00:00.057) 0:05:25.399 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node1 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Friday 17 January 2025 04:45:33 -0500 (0:00:00.213) 0:05:25.612 ******** skipping: [managed-node1] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', 
u'_device': u'/dev/mapper/luks-fde34b0b-0d82-46ce-b76a-4930d8f4912d', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'_kernel_device': u'/dev/dm-0', u'encryption': True, u'raid_level': None, u'name': u'test1', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'mount_passno': 0, u'_mount_id': u'/dev/mapper/luks-fde34b0b-0d82-46ce-b76a-4930d8f4912d', u'mount_user': None, u'raid_spare_count': None, u'raid_disks': [], u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'cache_devices': [], u'deduplication': None, u'mount_group': None, u'thin_pool_size': None, u'disks': [u'sda'], u'cached': False, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'fs_overwrite_existing': True, u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/mapper/luks-fde34b0b-0d82-46ce-b76a-4930d8f4912d", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-fde34b0b-0d82-46ce-b76a-4930d8f4912d", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:98 Friday 17 January 2025 04:45:33 -0500 (0:00:00.083) 0:05:25.696 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node1 TASK [Run 'stratis report'] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Friday 17 January 2025 04:45:33 -0500 (0:00:00.147) 0:05:25.843 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:11 Friday 17 January 2025 04:45:33 -0500 (0:00:00.057) 0:05:25.901 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result 
was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Friday 17 January 2025 04:45:33 -0500 (0:00:00.056) 0:05:25.957 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:25 Friday 17 January 2025 04:45:33 -0500 (0:00:00.053) 0:05:26.011 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:34 Friday 17 January 2025 04:45:34 -0500 (0:00:00.058) 0:05:26.069 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:44 Friday 17 January 2025 04:45:34 -0500 (0:00:00.057) 0:05:26.127 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:101 Friday 17 January 2025 04:45:34 -0500 (0:00:00.060) 0:05:26.188 ******** ok: [managed-node1] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Friday 17 January 2025 04:45:34 -0500 (0:00:00.054) 0:05:26.243 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node1 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Friday 17 January 2025 04:45:34 -0500 (0:00:00.113) 0:05:26.356 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Friday 17 January 2025 04:45:34 -0500 (0:00:00.067) 0:05:26.424 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node1 included: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node1 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Friday 17 January 2025 04:45:34 -0500 (0:00:00.279) 0:05:26.703 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-fde34b0b-0d82-46ce-b76a-4930d8f4912d" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Friday 17 January 2025 04:45:34 -0500 (0:00:00.067) 0:05:26.770 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Friday 17 January 2025 04:45:34 -0500 (0:00:00.068) 0:05:26.838 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Friday 17 January 2025 04:45:34 -0500 (0:00:00.058) 0:05:26.897 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Friday 17 January 2025 04:45:34 -0500 (0:00:00.065) 0:05:26.962 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Friday 17 January 2025 04:45:34 -0500 (0:00:00.055) 0:05:27.018 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Friday 17 January 2025 04:45:35 -0500 (0:00:00.061) 0:05:27.080 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of 
test volume device] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Friday 17 January 2025 04:45:35 -0500 (0:00:00.056) 0:05:27.136 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Friday 17 January 2025 04:45:35 -0500 (0:00:00.055) 0:05:27.192 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Friday 17 January 2025 04:45:35 -0500 (0:00:00.056) 0:05:27.249 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Friday 17 January 2025 04:45:35 -0500 (0:00:00.057) 0:05:27.306 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Friday 17 January 2025 04:45:35 -0500 (0:00:00.059) 0:05:27.366 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-fde34b0b-0d82-46ce-b76a-4930d8f4912d " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Friday 17 January 2025 04:45:35 -0500 (0:00:00.098) 0:05:27.464 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Friday 17 January 2025 04:45:35 -0500 (0:00:00.066) 0:05:27.530 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Friday 17 January 2025 04:45:35 -0500 (0:00:00.066) 0:05:27.597 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Friday 17 January 2025 04:45:35 -0500 (0:00:00.054) 0:05:27.652 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Friday 17 January 2025 04:45:35 -0500 (0:00:00.053) 0:05:27.705 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Friday 17 January 2025 04:45:35 -0500 (0:00:00.048) 0:05:27.753 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Friday 17 January 2025 04:45:35 -0500 (0:00:00.083) 0:05:27.836 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Friday 17 January 2025 04:45:35 -0500 (0:00:00.053) 0:05:27.890 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737107123.0008857, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1737107123.0008857, "dev": 5, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 397098, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1737107123.0008857, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Friday 17 January 2025 04:45:36 -0500 (0:00:00.333) 0:05:28.224 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Friday 17 January 2025 04:45:36 -0500 (0:00:00.048) 0:05:28.273 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Friday 17 January 2025 04:45:36 -0500 (0:00:00.038) 0:05:28.312 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Friday 17 January 2025 04:45:36 -0500 (0:00:00.045) 0:05:28.357 ******** ok: [managed-node1] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Friday 17 January 2025 04:45:36 -0500 (0:00:00.041) 0:05:28.398 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Friday 17 January 2025 04:45:36 -0500 (0:00:00.037) 0:05:28.435 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Friday 17 January 2025 04:45:36 -0500 (0:00:00.046) 0:05:28.482 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737107123.119886, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1737107123.119886, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 397153, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1737107123.119886, "nlink": 1, "path": "/dev/mapper/luks-fde34b0b-0d82-46ce-b76a-4930d8f4912d", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Friday 17 January 2025 04:45:36 -0500 (0:00:00.327) 0:05:28.809 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Friday 17 January 2025 04:45:37 -0500 (0:00:00.652) 0:05:29.461 ******** ok: [managed-node1] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda1" ], "delta": "0:00:00.025320", "end": "2025-01-17 04:45:37.744920", "rc": 0, "start": "2025-01-17 04:45:37.719600" } STDOUT: LUKS header information for /dev/sda1 Version: 1 Cipher name: aes Cipher mode: xts-plain64 Hash spec: sha256 Payload offset: 8192 MK bits: 512 MK digest: 
34 18 f1 04 db 6b 2b f1 af d2 7a 5a e2 b6 f8 d1 1b 36 e3 11 MK salt: a5 b4 f0 91 44 bb 43 ec c4 a2 2b 5d b0 8d 4b a9 0d 87 ad 33 1f 96 9f 2d a7 11 fd 3c 90 59 69 51 MK iterations: 23206 UUID: fde34b0b-0d82-46ce-b76a-4930d8f4912d Key Slot 0: ENABLED Iterations: 370258 Salt: b3 2d 98 99 e6 1f da 52 38 0f 53 53 dd b4 b1 92 71 37 42 18 3b ee da cd c4 93 2b c5 69 c6 d8 4f Key material offset: 8 AF stripes: 4000 Key Slot 1: DISABLED Key Slot 2: DISABLED Key Slot 3: DISABLED Key Slot 4: DISABLED Key Slot 5: DISABLED Key Slot 6: DISABLED Key Slot 7: DISABLED TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Friday 17 January 2025 04:45:37 -0500 (0:00:00.366) 0:05:29.828 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Friday 17 January 2025 04:45:37 -0500 (0:00:00.064) 0:05:29.892 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Friday 17 January 2025 04:45:37 -0500 (0:00:00.049) 0:05:29.942 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Friday 17 January 2025 04:45:37 -0500 (0:00:00.047) 0:05:29.989 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Friday 17 January 2025 04:45:38 -0500 (0:00:00.048) 0:05:30.037 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Friday 17 January 2025 04:45:38 -0500 (0:00:00.041) 0:05:30.079 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Friday 17 January 2025 04:45:38 -0500 (0:00:00.040) 0:05:30.119 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Friday 17 January 2025 04:45:38 -0500 (0:00:00.040) 0:05:30.159 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-fde34b0b-0d82-46ce-b76a-4930d8f4912d /dev/sda1 
VALUE_SPECIFIED_IN_NO_LOG_PARAMETER" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Friday 17 January 2025 04:45:38 -0500 (0:00:00.050) 0:05:30.209 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Friday 17 January 2025 04:45:38 -0500 (0:00:00.045) 0:05:30.255 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Friday 17 January 2025 04:45:38 -0500 (0:00:00.048) 0:05:30.303 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Friday 17 January 2025 04:45:38 -0500 (0:00:00.048) 0:05:30.351 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Friday 17 January 2025 04:45:38 -0500 (0:00:00.049) 0:05:30.401 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Friday 17 January 2025 04:45:38 -0500 (0:00:00.038) 0:05:30.439 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Friday 17 January 2025 04:45:38 -0500 (0:00:00.039) 0:05:30.479 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Friday 17 January 2025 04:45:38 -0500 (0:00:00.040) 0:05:30.519 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Friday 17 January 2025 04:45:38 -0500 (0:00:00.038) 0:05:30.558 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Friday 17 January 2025 04:45:38 -0500 (0:00:00.037) 0:05:30.595 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Friday 17 January 2025 04:45:38 -0500 (0:00:00.037) 0:05:30.633 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Friday 17 January 2025 04:45:38 -0500 (0:00:00.038) 0:05:30.671 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Friday 17 January 2025 04:45:38 -0500 (0:00:00.047) 0:05:30.719 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Friday 17 January 2025 04:45:38 -0500 (0:00:00.049) 0:05:30.769 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Friday 17 January 2025 04:45:38 -0500 (0:00:00.060) 0:05:30.829 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Friday 17 January 2025 04:45:38 -0500 (0:00:00.058) 0:05:30.888 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Friday 17 January 2025 04:45:38 -0500 (0:00:00.083) 0:05:30.971 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Friday 17 January 2025 04:45:39 -0500 (0:00:00.063) 0:05:31.034 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Friday 17 January 2025 04:45:39 -0500 (0:00:00.064) 0:05:31.099 ******** ok: [managed-node1] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Friday 17 January 2025 04:45:39 -0500 (0:00:00.065) 0:05:31.165 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Friday 17 January 2025 04:45:39 -0500 (0:00:00.061) 0:05:31.227 ******** skipping: [managed-node1] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Friday 17 January 2025 04:45:39 -0500 (0:00:00.058) 0:05:31.285 ******** skipping: [managed-node1] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Friday 17 January 2025 04:45:39 -0500 (0:00:00.060) 0:05:31.346 ******** skipping: [managed-node1] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Friday 17 January 2025 04:45:39 -0500 (0:00:00.058) 0:05:31.405 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Friday 17 January 2025 04:45:39 -0500 (0:00:00.063) 0:05:31.468 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Friday 17 January 2025 04:45:39 -0500 (0:00:00.056) 0:05:31.524 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Friday 17 January 2025 04:45:39 -0500 (0:00:00.055) 0:05:31.580 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Friday 17 January 2025 04:45:39 -0500 (0:00:00.056) 0:05:31.636 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] 
******************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Friday 17 January 2025 04:45:39 -0500 (0:00:00.045) 0:05:31.682 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Friday 17 January 2025 04:45:39 -0500 (0:00:00.045) 0:05:31.728 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Friday 17 January 2025 04:45:39 -0500 (0:00:00.053) 0:05:31.781 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Friday 17 January 2025 04:45:39 -0500 (0:00:00.046) 0:05:31.827 ******** skipping: [managed-node1] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Friday 17 January 2025 04:45:39 -0500 (0:00:00.039) 0:05:31.867 ******** skipping: [managed-node1] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Friday 17 January 2025 04:45:39 -0500 (0:00:00.045) 0:05:31.912 ******** skipping: [managed-node1] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Friday 17 January 2025 04:45:39 -0500 (0:00:00.054) 0:05:31.967 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Friday 17 January 2025 04:45:40 -0500 (0:00:00.071) 0:05:32.039 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Friday 17 January 2025 04:45:40 -0500 (0:00:00.058) 0:05:32.097 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Friday 17 January 2025 04:45:40 -0500 (0:00:00.070) 0:05:32.168 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] 
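
Every size and thin-pool check in this stretch is skipped because the test volume is a plain, non-thin encrypted LV, so each task's when condition evaluates false (the volume parameters later in this log likewise show thin: False). A minimal sketch of the kind of conditional gating involved, assuming the conditions test the volume's thin flag; the task and variable names below are illustrative, not the role's actual test source:

    - name: Establish base value for expected thin pool size
      ansible.builtin.set_fact:
        # hypothetical variable names, for illustration only
        storage_test_tp_base_size: "{{ storage_test_pool_size }}"
      when: storage_test_volume.thin | bool  # false for this volume, producing the skips seen here
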
********************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Friday 17 January 2025 04:45:40 -0500 (0:00:00.062) 0:05:32.230 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Friday 17 January 2025 04:45:40 -0500 (0:00:00.064) 0:05:32.294 ******** ok: [managed-node1] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Friday 17 January 2025 04:45:40 -0500 (0:00:00.065) 0:05:32.360 ******** ok: [managed-node1] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Friday 17 January 2025 04:45:40 -0500 (0:00:00.058) 0:05:32.419 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Friday 17 January 2025 04:45:40 -0500 (0:00:00.085) 0:05:32.505 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Friday 17 January 2025 04:45:40 -0500 (0:00:00.060) 0:05:32.565 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Friday 17 January 2025 04:45:40 -0500 (0:00:00.060) 0:05:32.626 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Friday 17 January 2025 04:45:40 -0500 (0:00:00.057) 0:05:32.684 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Friday 17 January 2025 04:45:40 -0500 (0:00:00.067) 0:05:32.751 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Friday 17 January 2025 04:45:40 -0500 (0:00:00.076) 0:05:32.827 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Friday 17 January 2025 04:45:40 -0500 (0:00:00.087) 0:05:32.915 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Friday 17 January 2025 04:45:40 -0500 (0:00:00.059) 0:05:32.974 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Friday 17 January 2025 04:45:41 -0500 (0:00:00.087) 0:05:33.062 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Friday 17 January 2025 04:45:41 -0500 (0:00:00.059) 0:05:33.122 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Remove the key file] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:346 Friday 17 January 2025 04:45:41 -0500 (0:00:00.067) 0:05:33.189 ******** ok: [managed-node1] => { "changed": false, "path": "/tmp/storage_testaZEGMVlukskey", "state": "absent" } TASK [Test for correct handling of new encrypted volume w/ no key] ************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:356 Friday 17 January 2025 04:45:41 -0500 (0:00:00.505) 0:05:33.694 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node1 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Friday 17 January 2025 04:45:41 -0500 (0:00:00.124) 0:05:33.818 ******** ok: [managed-node1] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Friday 17 January 2025 04:45:41 -0500 (0:00:00.078) 0:05:33.897 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 January 2025 04:45:41 -0500 (0:00:00.092) 0:05:33.989 ******** included: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 January 2025 04:45:42 -0500 (0:00:00.124) 0:05:34.114 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 January 2025 04:45:42 -0500 (0:00:00.082) 0:05:34.197 ******** skipping: [managed-node1] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node1] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node1] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node1] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 January 2025 04:45:42 -0500 (0:00:00.142) 0:05:34.339 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 January 2025 04:45:42 -0500 (0:00:00.065) 0:05:34.405 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Friday 17 January 2025 04:45:42 -0500 (0:00:00.062) 0:05:34.467 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 January 2025 04:45:42 -0500 (0:00:00.083) 0:05:34.551 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 January 2025 04:45:42 -0500 (0:00:00.058) 0:05:34.609 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Friday 17 January 2025 04:45:42 -0500 (0:00:00.152) 0:05:34.762 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 January 2025 04:45:46 -0500 (0:00:03.952) 0:05:38.714 ******** ok: [managed-node1] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Friday 17 January 2025 04:45:46 -0500 (0:00:00.047) 0:05:38.762 ******** ok: [managed-node1] => { "storage_volumes": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Friday 17 January 2025 04:45:46 -0500 (0:00:00.045) 0:05:38.808 ******** ok: [managed-node1] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Friday 17 January 2025 04:45:50 -0500 (0:00:03.835) 0:05:42.644 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 January 2025 04:45:50 -0500 (0:00:00.087) 0:05:42.731 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 January 2025 04:45:50 -0500 (0:00:00.036) 0:05:42.767 ******** skipping: [managed-node1] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 January 2025 04:45:50 -0500 (0:00:00.035) 0:05:42.803 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 January 2025 04:45:50 -0500 (0:00:00.036) 0:05:42.840 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Friday 17 January 2025 04:45:51 -0500 (0:00:00.699) 0:05:43.540 ******** ok: [managed-node1] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": 
"inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, 
"getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", 
"source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", 
"status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": 
"wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Friday 17 January 2025 04:45:52 -0500 (0:00:01.043) 0:05:44.584 ******** ok: [managed-node1] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Friday 17 January 2025 04:45:52 -0500 (0:00:00.080) 0:05:44.664 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Friday 17 January 2025 04:45:52 -0500 (0:00:00.057) 0:05:44.721 ******** fatal: [managed-node1]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'test1' missing key/password TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:109 Friday 17 January 2025 04:45:56 -0500 (0:00:04.068) 0:05:48.790 ******** fatal: [managed-node1]: FAILED! => { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'diskvolume_mkfs_option_map': {u'ext4': u'-F', u'ext3': u'-F', u'ext2': u'-F'}, u'safe_mode': False, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'encryption_tang_url': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_tang_thumbprint': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'encryption': True, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'mount_passno': 0, u'mount_user': None, u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'mount_group': None, u'fs_overwrite_existing': True, u'disks': [], u'cached': False, u'thin_pool_size': None, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'shared': False, u'encryption_clevis_pin': None, u'type': u'lvm', u'encryption_cipher': None, u'raid_chunk_size': None}], u'volumes': [], u'pool_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, 
u'encryption_luks_version': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_password': None, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'shared': False, u'type': u'lvm', u'encryption_key_size': None, u'raid_chunk_size': None}, u'volume_defaults': {u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 0, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'cache_size': 0, u'raid_spare_count': None, u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"encrypted volume 'test1' missing key/password"} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Friday 17 January 2025 04:45:56 -0500 (0:00:00.069) 0:05:48.859 ******** TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Friday 17 January 2025 04:45:56 -0500 (0:00:00.043) 0:05:48.903 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Friday 17 January 2025 04:45:56 -0500 (0:00:00.049) 0:05:48.953 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Friday 17 January 2025 04:45:56 -0500 (0:00:00.057) 0:05:49.011 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Create an encrypted lvm volume w/ default fs] **************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:374 Friday 17 January 2025 04:45:57 -0500 (0:00:00.038) 0:05:49.049 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 January 2025 04:45:57 -0500 (0:00:00.083) 0:05:49.132 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 January 2025 04:45:57 -0500 (0:00:00.091) 0:05:49.223 
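
The failed invocation above and the run now starting differ only in the volume's key material: the earlier Show storage_pools output listed the test1 volume with encryption: true but no password, which blivet rejects with "encrypted volume 'test1' missing key/password", while this run supplies one. A minimal reconstruction of the passing spec, assembled from the values the next Show storage_pools task prints; this is a sketch for reference, not an excerpt of the test source itself:

    storage_pools:
      - name: foo
        disks: [sda]
        type: lvm
        volumes:
          - name: test1
            size: 4g
            mount_point: /opt/test1
            encryption: true
            # Omitting encryption_password reproduces the failure above:
            #   "encrypted volume 'test1' missing key/password"
            encryption_password: yabbadabbadoo
            encryption_luks_version: luks1
            encryption_cipher: aes-xts-plain64
            encryption_key_size: 512
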
******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 January 2025 04:45:57 -0500 (0:00:00.129) 0:05:49.353 ******** skipping: [managed-node1] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node1] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node1] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node1] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 January 2025 04:45:57 -0500 (0:00:00.098) 0:05:49.452 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 January 2025 04:45:57 -0500 (0:00:00.038) 0:05:49.490 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Friday 17 January 2025 04:45:57 -0500 (0:00:00.041) 0:05:49.531 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 January 2025 04:45:57 -0500 (0:00:00.054) 0:05:49.585 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 January 2025 04:45:57 -0500 (0:00:00.069) 0:05:49.655 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Friday 17 January 2025 04:45:57 -0500 (0:00:00.126) 0:05:49.781 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 January 2025 04:46:01 -0500 (0:00:03.954) 0:05:53.736 ******** ok: [managed-node1] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Friday 17 January 2025 04:46:01 -0500 (0:00:00.092) 0:05:53.829 ******** ok: [managed-node1] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Friday 17 January 2025 04:46:01 -0500 (0:00:00.062) 0:05:53.891 ******** ok: [managed-node1] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Friday 17 January 2025 04:46:05 -0500 (0:00:03.971) 0:05:57.863 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 January 2025 04:46:05 -0500 (0:00:00.080) 0:05:57.943 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 January 2025 04:46:05 -0500 (0:00:00.041) 0:05:57.985 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
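NOTE: The storage_pools value printed by "Show storage_pools" above is the test's input to the storage role. A minimal sketch of the invocation (all values taken verbatim from the output above; the surrounding task layout in tests_luks.yml is assumed, not copied):

    - name: Create an encrypted lvm volume w/ default fs
      include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_pools:
          - name: foo
            type: lvm
            disks: [sda]
            volumes:
              - name: test1
                size: 4g
                mount_point: /opt/test1
                encryption: true
                encryption_cipher: aes-xts-plain64
                encryption_key_size: 512
                encryption_luks_version: luks1
                encryption_password: yabbadabbadoo

TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: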
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 January 2025 04:46:06 -0500 (0:00:00.046) 0:05:58.032 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 January 2025 04:46:06 -0500 (0:00:00.035) 0:05:58.067 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Friday 17 January 2025 04:46:06 -0500 (0:00:00.642) 0:05:58.710 ******** ok: [managed-node1] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": 
"static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": 
"gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" 
}, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": 
"systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK 
[fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Friday 17 January 2025 04:46:07 -0500 (0:00:01.034) 0:05:59.744 ******** ok: [managed-node1] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Friday 17 January 2025 04:46:07 -0500 (0:00:00.090) 0:05:59.834 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Friday 17 January 2025 04:46:07 -0500 (0:00:00.054) 0:05:59.889 ******** changed: [managed-node1] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-fde34b0b-0d82-46ce-b76a-4930d8f4912d", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-fde34b0b-0d82-46ce-b76a-4930d8f4912d", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-79e8fafe-36db-4db0-bb7e-9669c9da3637", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-79e8fafe-36db-4db0-bb7e-9669c9da3637", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-fde34b0b-0d82-46ce-b76a-4930d8f4912d", "password": "-", "state": "absent" }, { "backing_device": "/dev/mapper/foo-test1", "name": "luks-79e8fafe-36db-4db0-bb7e-9669c9da3637", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-79e8fafe-36db-4db0-bb7e-9669c9da3637" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-fde34b0b-0d82-46ce-b76a-4930d8f4912d", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-79e8fafe-36db-4db0-bb7e-9669c9da3637", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-79e8fafe-36db-4db0-bb7e-9669c9da3637", "_kernel_device": "/dev/dm-1", "_mount_id": 
"/dev/mapper/luks-79e8fafe-36db-4db0-bb7e-9669c9da3637", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Friday 17 January 2025 04:46:18 -0500 (0:00:10.996) 0:06:10.885 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Friday 17 January 2025 04:46:18 -0500 (0:00:00.036) 0:06:10.922 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737107125.4298882, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "e2afe29c8e3814e30d3cac92110ff263f4fe1a5b", "ctime": 1737107125.4268882, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263644, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1737107125.4268882, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1311, "uid": 0, "version": "18446744072031193646", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Friday 17 January 2025 04:46:19 -0500 (0:00:00.396) 0:06:11.318 ******** ok: [managed-node1] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Friday 17 January 2025 04:46:19 -0500 (0:00:00.356) 0:06:11.674 ******** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Friday 17 January 2025 04:46:19 -0500 (0:00:00.039) 0:06:11.714 ******** ok: [managed-node1] => { 
"blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-fde34b0b-0d82-46ce-b76a-4930d8f4912d", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-fde34b0b-0d82-46ce-b76a-4930d8f4912d", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-79e8fafe-36db-4db0-bb7e-9669c9da3637", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-79e8fafe-36db-4db0-bb7e-9669c9da3637", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-fde34b0b-0d82-46ce-b76a-4930d8f4912d", "password": "-", "state": "absent" }, { "backing_device": "/dev/mapper/foo-test1", "name": "luks-79e8fafe-36db-4db0-bb7e-9669c9da3637", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-79e8fafe-36db-4db0-bb7e-9669c9da3637" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-fde34b0b-0d82-46ce-b76a-4930d8f4912d", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-79e8fafe-36db-4db0-bb7e-9669c9da3637", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-79e8fafe-36db-4db0-bb7e-9669c9da3637", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-79e8fafe-36db-4db0-bb7e-9669c9da3637", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": 
"present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Friday 17 January 2025 04:46:19 -0500 (0:00:00.063) 0:06:11.777 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-79e8fafe-36db-4db0-bb7e-9669c9da3637", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-79e8fafe-36db-4db0-bb7e-9669c9da3637", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Friday 17 January 2025 04:46:19 -0500 (0:00:00.045) 0:06:11.823 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Friday 17 January 2025 04:46:19 -0500 (0:00:00.043) 0:06:11.866 ******** changed: [managed-node1] => (item={u'src': u'/dev/mapper/luks-fde34b0b-0d82-46ce-b76a-4930d8f4912d', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-fde34b0b-0d82-46ce-b76a-4930d8f4912d", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-fde34b0b-0d82-46ce-b76a-4930d8f4912d" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Friday 17 January 2025 04:46:20 -0500 (0:00:00.346) 0:06:12.213 ******** ok: [managed-node1] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Friday 17 January 2025 04:46:20 -0500 (0:00:00.515) 0:06:12.728 ******** changed: [managed-node1] => (item={u'src': u'/dev/mapper/luks-79e8fafe-36db-4db0-bb7e-9669c9da3637', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-79e8fafe-36db-4db0-bb7e-9669c9da3637", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-79e8fafe-36db-4db0-bb7e-9669c9da3637" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Friday 17 January 2025 04:46:21 -0500 (0:00:00.428) 0:06:13.156 ******** skipping: [managed-node1] => (item={u'src': u'/dev/mapper/luks-79e8fafe-36db-4db0-bb7e-9669c9da3637', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-79e8fafe-36db-4db0-bb7e-9669c9da3637", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Friday 17 January 2025 04:46:21 -0500 (0:00:00.073) 0:06:13.230 ******** ok: [managed-node1] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Friday 17 January 2025 04:46:21 -0500 (0:00:00.504) 0:06:13.734 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737107130.5338936, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "55e1abf42d0357db34e7db9ced284ebd84259d76", "ctime": 1737107126.7288897, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263663, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1737107126.7278895, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 88, "uid": 0, "version": "18446744072031199911", "wgrp": false, "woth": 
false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Friday 17 January 2025 04:46:22 -0500 (0:00:00.330) 0:06:14.065 ******** changed: [managed-node1] => (item={u'state': u'absent', u'password': u'-', u'name': u'luks-fde34b0b-0d82-46ce-b76a-4930d8f4912d', u'backing_device': u'/dev/sda1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-fde34b0b-0d82-46ce-b76a-4930d8f4912d", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed changed: [managed-node1] => (item={u'state': u'present', u'password': u'-', u'name': u'luks-79e8fafe-36db-4db0-bb7e-9669c9da3637', u'backing_device': u'/dev/mapper/foo-test1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-79e8fafe-36db-4db0-bb7e-9669c9da3637", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Friday 17 January 2025 04:46:22 -0500 (0:00:00.649) 0:06:14.714 ******** ok: [managed-node1] TASK [Verify role results] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:393 Friday 17 January 2025 04:46:23 -0500 (0:00:00.693) 0:06:15.408 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node1 TASK [Print out pool information] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Friday 17 January 2025 04:46:23 -0500 (0:00:00.073) 0:06:15.482 ******** ok: [managed-node1] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-79e8fafe-36db-4db0-bb7e-9669c9da3637", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-79e8fafe-36db-4db0-bb7e-9669c9da3637", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": 
"/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Friday 17 January 2025 04:46:23 -0500 (0:00:00.050) 0:06:15.532 ******** skipping: [managed-node1] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Friday 17 January 2025 04:46:23 -0500 (0:00:00.034) 0:06:15.567 ******** ok: [managed-node1] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "79e8fafe-36db-4db0-bb7e-9669c9da3637" }, "/dev/mapper/luks-79e8fafe-36db-4db0-bb7e-9669c9da3637": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-79e8fafe-36db-4db0-bb7e-9669c9da3637", "size": "4G", "type": "crypt", "uuid": "16845911-a6ba-48f2-8094-6a7bfd5a0255" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "q5PvpB-4kw5-zuiW-X5c3-3C9L-v0UQ-y98OeM" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Friday 17 January 2025 04:46:23 -0500 (0:00:00.347) 0:06:15.914 ******** ok: [managed-node1] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002922", "end": "2025-01-17 04:46:24.194537", "rc": 0, "start": "2025-01-17 04:46:24.191615" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Thu Jun 20 10:23:46 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk' 
# See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-79e8fafe-36db-4db0-bb7e-9669c9da3637 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Friday 17 January 2025 04:46:24 -0500 (0:00:00.387) 0:06:16.301 ******** ok: [managed-node1] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002880", "end": "2025-01-17 04:46:24.566818", "failed_when_result": false, "rc": 0, "start": "2025-01-17 04:46:24.563938" } STDOUT: luks-79e8fafe-36db-4db0-bb7e-9669c9da3637 /dev/mapper/foo-test1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Friday 17 January 2025 04:46:24 -0500 (0:00:00.367) 0:06:16.669 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node1 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Friday 17 January 2025 04:46:24 -0500 (0:00:00.121) 0:06:16.790 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Friday 17 January 2025 04:46:24 -0500 (0:00:00.058) 0:06:16.849 ******** ok: [managed-node1] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.018467", "end": "2025-01-17 04:46:25.264655", "rc": 0, "start": "2025-01-17 04:46:25.246188" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Friday 17 January 2025 04:46:25 -0500 (0:00:00.520) 0:06:17.369 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed
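NOTE: With --binary, vgs prints the shared VG attribute as 0/1, so the STDOUT of 0 above matches shared: false in the pool spec. A sketch of this check (task names taken from the log; the register name and assert expression are assumed for illustration):

    - name: Get VG shared value status
      command: vgs --noheadings --binary -o shared foo
      register: __vg_shared   # hypothetical register name
      changed_when: false
    - name: Verify that VG shared value checks out
      assert:
        that:
          # storage_test_pool is assumed to hold the pool spec under test
          - (__vg_shared.stdout | trim == '1') == (storage_test_pool.shared | bool)

TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Friday 17 January 2025 04:46:25 -0500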
(0:00:00.069) 0:06:17.439 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node1 TASK [Set test variables] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Friday 17 January 2025 04:46:25 -0500 (0:00:00.155) 0:06:17.594 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Friday 17 January 2025 04:46:25 -0500 (0:00:00.055) 0:06:17.650 ******** ok: [managed-node1] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Friday 17 January 2025 04:46:26 -0500 (0:00:00.566) 0:06:18.216 ******** ok: [managed-node1] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Friday 17 January 2025 04:46:26 -0500 (0:00:00.050) 0:06:18.267 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Friday 17 January 2025 04:46:26 -0500 (0:00:00.056) 0:06:18.323 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Friday 17 January 2025 04:46:26 -0500 (0:00:00.050) 0:06:18.374 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Friday 17 January 2025 04:46:26 -0500 (0:00:00.043) 0:06:18.417 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Friday 17 January 2025 04:46:26 -0500 (0:00:00.048) 0:06:18.466 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: 

TASK [Check the type of each PV] ***********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:51
Friday 17 January 2025 04:46:26 -0500 (0:00:00.051) 0:06:18.518 ********
ok: [managed-node1] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" }
MSG: All assertions passed

TASK [Check that blivet supports PV grow to fill] ******************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:64
Friday 17 January 2025 04:46:26 -0500 (0:00:00.100) 0:06:18.618 ********
ok: [managed-node1] => { "changed": false, "rc": 0 }
STDOUT:
False
STDERR:
Shared connection to 10.31.46.65 closed.

TASK [Verify that PVs fill the whole devices when they should] *****************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:73
Friday 17 January 2025 04:46:26 -0500 (0:00:00.251) 0:06:18.870 ********
skipping: [managed-node1] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" }

TASK [Check MD RAID] ***********************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:83
Friday 17 January 2025 04:46:26 -0500 (0:00:00.042) 0:06:18.913 ********
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node1

TASK [Get information about RAID] **********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8
Friday 17 January 2025 04:46:26 -0500 (0:00:00.078) 0:06:18.991 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set active devices regex] ************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14
Friday 17 January 2025 04:46:27 -0500 (0:00:00.044) 0:06:19.036 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set spare devices regex] *************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19
Friday 17 January 2025 04:46:27 -0500 (0:00:00.056) 0:06:19.092 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set md version regex] ****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24
Friday 17 January 2025 04:46:27 -0500 (0:00:00.059) 0:06:19.151 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set md chunk size regex] *************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29
Friday 17 January 2025 04:46:27 -0500 (0:00:00.133) 0:06:19.285 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the chunk size] ****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37
Friday 17 January 2025 04:46:27 -0500 (0:00:00.048) 0:06:19.334 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID active devices count] *****************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46
Friday 17 January 2025 04:46:27 -0500 (0:00:00.047) 0:06:19.381 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID spare devices count] ******************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55
Friday 17 January 2025 04:46:27 -0500 (0:00:00.036) 0:06:19.418 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID metadata version] *********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64
Friday 17 January 2025 04:46:27 -0500 (0:00:00.036) 0:06:19.454 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID chunk size] ***************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74
Friday 17 January 2025 04:46:27 -0500 (0:00:00.036) 0:06:19.491 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Reset variables used by tests] *******************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83
Friday 17 January 2025 04:46:27 -0500 (0:00:00.038) 0:06:19.530 ********
ok: [managed-node1] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false }

TASK [Check LVM RAID] **********************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:86
Friday 17 January 2025 04:46:27 -0500 (0:00:00.037) 0:06:19.567 ********
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node1

TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2
Friday 17 January 2025 04:46:27 -0500 (0:00:00.110) 0:06:19.678 ********
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node1

TASK [Get information about the LV] ********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8
Friday 17 January 2025 04:46:27 -0500 (0:00:00.110) 0:06:19.788 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set LV segment type] *****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16
Friday 17 January 2025 04:46:27 -0500 (0:00:00.047) 0:06:19.835 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check segment type] ******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20
Friday 17 January 2025 04:46:27 -0500 (0:00:00.046) 0:06:19.882 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set LV stripe size] ******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27
Friday 17 January 2025 04:46:27 -0500 (0:00:00.037) 0:06:19.920 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the requested stripe size] *****************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31
Friday 17 January 2025 04:46:27 -0500 (0:00:00.037) 0:06:19.957 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set expected stripe size] ************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37
Friday 17 January 2025 04:46:27 -0500 (0:00:00.045) 0:06:20.003 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check stripe size] *******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42
Friday 17 January 2025 04:46:28 -0500 (0:00:00.052) 0:06:20.056 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check Thin Pools] ********************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:89
Friday 17 January 2025 04:46:28 -0500 (0:00:00.057) 0:06:20.113 ********
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node1

TASK [Validate pool member thinpool settings] **********************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2
Friday 17 January 2025 04:46:28 -0500 (0:00:00.108) 0:06:20.222 ********
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node1

TASK [Get information about thinpool] ******************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8
Friday 17 January 2025 04:46:28 -0500 (0:00:00.107) 0:06:20.329 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check that volume is in correct thinpool (when thinp name is provided)] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16
Friday 17 January 2025 04:46:28 -0500 (0:00:00.039) 0:06:20.369 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check that volume is in thinpool (when thinp name is not provided)] ******
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22
Friday 17 January 2025 04:46:28 -0500 (0:00:00.036) 0:06:20.406 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Reset variable used by test] *********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26
Friday 17 January 2025 04:46:28 -0500 (0:00:00.036) 0:06:20.442 ********
ok: [managed-node1] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false }

TASK [Check member encryption] *************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:92
Friday 17 January 2025 04:46:28 -0500 (0:00:00.038) 0:06:20.481 ********
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node1

TASK [Set test variables] ******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5
Friday 17 January 2025 04:46:28 -0500 (0:00:00.086) 0:06:20.568 ********
ok: [managed-node1] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10
Friday 17 January 2025 04:46:28 -0500 (0:00:00.044) 0:06:20.613 ********
skipping: [managed-node1] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" }

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17
Friday 17 January 2025 04:46:28 -0500 (0:00:00.052) 0:06:20.665 ********
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node1

TASK [Set variables used by tests] *********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2
Friday 17 January 2025 04:46:28 -0500 (0:00:00.076) 0:06:20.742 ********
ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6
Friday 17 January 2025 04:46:28 -0500 (0:00:00.052) 0:06:20.794 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14
Friday 17 January 2025 04:46:28 -0500 (0:00:00.061) 0:06:20.856 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23
Friday 17 January 2025 04:46:28 -0500 (0:00:00.055) 0:06:20.911 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32
Friday 17 January 2025 04:46:28 -0500 (0:00:00.055) 0:06:20.966 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41
Friday 17 January 2025 04:46:29 -0500 (0:00:00.059) 0:06:21.026 ********
ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false }

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24
Friday 17 January 2025 04:46:29 -0500 (0:00:00.083) 0:06:21.109 ********
ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false }

TASK [Check VDO] ***************************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:95
Friday 17 January 2025 04:46:29 -0500 (0:00:00.112) 0:06:21.221 ********
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node1

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2
Friday 17 January 2025 04:46:29 -0500 (0:00:00.166) 0:06:21.387 ********
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node1

TASK [Get information about VDO deduplication] *********************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8
Friday 17 January 2025 04:46:29 -0500 (0:00:00.121) 0:06:21.509 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check if VDO deduplication is off] ***************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15
Friday 17 January 2025 04:46:29 -0500 (0:00:00.075) 0:06:21.585 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check if VDO deduplication is on] ****************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21
Friday 17 January 2025 04:46:29 -0500 (0:00:00.060) 0:06:21.646 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about VDO compression] ***********************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27
Friday 17 January 2025 04:46:29 -0500 (0:00:00.099) 0:06:21.745 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check if VDO compression is off] *****************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34
Friday 17 January 2025 04:46:29 -0500 (0:00:00.055) 0:06:21.800 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check if VDO compression is on] ******************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40
Friday 17 January 2025 04:46:29 -0500 (0:00:00.057) 0:06:21.858 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46
Friday 17 January 2025 04:46:29 -0500 (0:00:00.049) 0:06:21.908 ********
ok: [managed-node1] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [Check Stratis] ***********************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:98
Friday 17 January 2025 04:46:29 -0500 (0:00:00.038) 0:06:21.947 ********
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node1

TASK [Run 'stratis report'] ****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6
Friday 17 January 2025 04:46:30 -0500 (0:00:00.100) 0:06:22.047 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about Stratis] *******************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:11
Friday 17 January 2025 04:46:30 -0500 (0:00:00.046) 0:06:22.093 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the pool was created] ****************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15
Friday 17 January 2025 04:46:30 -0500 (0:00:00.068) 0:06:22.161 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that encryption is correctly set] *********************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:25
Friday 17 January 2025 04:46:30 -0500 (0:00:00.070) 0:06:22.232 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that Clevis/Tang encryption is correctly set] *********************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:34
Friday 17 January 2025 04:46:30 -0500 (0:00:00.060) 0:06:22.292 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Reset variable used by test] *********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:44
Friday 17 January 2025 04:46:30 -0500 (0:00:00.076) 0:06:22.368 ********
ok: [managed-node1] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false }

TASK [Clean up test variables] *************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:101
Friday 17 January 2025 04:46:30 -0500 (0:00:00.063) 0:06:22.432 ********
ok: [managed-node1] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [Verify the volumes] ******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3
Friday 17 January 2025 04:46:30 -0500 (0:00:00.163) 0:06:22.596 ********
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node1

TASK [Set storage volume test variables] ***************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2
Friday 17 January 2025 04:46:30 -0500 (0:00:00.122) 0:06:22.718 ********
ok: [managed-node1] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [Run test verify for {{ storage_test_volume_subset }}] ********************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19
Friday 17 January 2025 04:46:30 -0500 (0:00:00.101) 0:06:22.819 ********
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node1
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node1
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node1
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node1
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node1
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node1
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node1
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node1
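
[NOTE]: The eight includes above are driven by a single task: the verify playbook loops over the subset list set in _storage_volume_tests and includes test-verify-volume-<subset>.yml once per entry, which is why the templated task name shows {{ storage_test_volume_subset }}. Sketched from the behaviour visible here (the exact keywords in the test file may differ):

    - name: Run test verify for {{ storage_test_volume_subset }}
      include_tasks: "test-verify-volume-{{ storage_test_volume_subset }}.yml"
      loop: "{{ _storage_volume_tests }}"
      loop_control:
        loop_var: storage_test_volume_subset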

TASK [Get expected mount device based on device type] **************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7
Friday 17 January 2025 04:46:31 -0500 (0:00:00.334) 0:06:23.153 ********
ok: [managed-node1] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-79e8fafe-36db-4db0-bb7e-9669c9da3637" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11
Friday 17 January 2025 04:46:31 -0500 (0:00:00.067) 0:06:23.220 ********
ok: [managed-node1] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Get information about the mountpoint directory] **************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19
Friday 17 January 2025 04:46:31 -0500 (0:00:00.070) 0:06:23.291 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28
Friday 17 January 2025 04:46:31 -0500 (0:00:00.065) 0:06:23.356 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount directory user] *********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36
Friday 17 January 2025 04:46:31 -0500 (0:00:00.079) 0:06:23.436 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify mount directory group] ********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42
Friday 17 January 2025 04:46:31 -0500 (0:00:00.100) 0:06:23.536 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify mount directory permissions] **************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48
Friday 17 January 2025 04:46:31 -0500 (0:00:00.064) 0:06:23.600 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get path of test volume device] ******************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57
Friday 17 January 2025 04:46:31 -0500 (0:00:00.056) 0:06:23.657 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63
Friday 17 January 2025 04:46:31 -0500 (0:00:00.065) 0:06:23.723 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69
Friday 17 January 2025 04:46:31 -0500 (0:00:00.056) 0:06:23.779 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79
Friday 17 January 2025 04:46:31 -0500 (0:00:00.098) 0:06:23.878 ********
ok: [managed-node1] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2
Friday 17 January 2025 04:46:31 -0500 (0:00:00.087) 0:06:23.965 ********
ok: [managed-node1] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-79e8fafe-36db-4db0-bb7e-9669c9da3637 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17
Friday 17 January 2025 04:46:32 -0500 (0:00:00.110) 0:06:24.075 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24
Friday 17 January 2025 04:46:32 -0500 (0:00:00.070) 0:06:24.146 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33
Friday 17 January 2025 04:46:32 -0500 (0:00:00.066) 0:06:24.212 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify fingerprint] ******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45
Friday 17 January 2025 04:46:32 -0500 (0:00:00.053) 0:06:24.266 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed
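
[NOTE]: The match-count facts above are built by filtering the lines of /etc/fstab: one list keeps the lines containing the device identifier (note the trailing space in the pattern, which anchors the end of the field), one the mount point, one the full options field, and each is then asserted to have exactly one element. A condensed sketch of that technique, assuming a hypothetical fstab_lines variable holding the file's lines (id_matches is likewise invented here):

    - name: Collect fstab lines that reference the device (sketch)
      set_fact:
        id_matches: "{{ fstab_lines | select('search', '/dev/mapper/luks-79e8fafe-36db-4db0-bb7e-9669c9da3637 ') | list }}"

    - name: Exactly one entry may exist for the volume
      assert:
        that:
          - id_matches | length == 1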

TASK [Clean up variables] ******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52
Friday 17 January 2025 04:46:32 -0500 (0:00:00.068) 0:06:24.335 ********
ok: [managed-node1] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6
Friday 17 January 2025 04:46:32 -0500 (0:00:00.062) 0:06:24.398 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14
Friday 17 January 2025 04:46:32 -0500 (0:00:00.078) 0:06:24.476 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3
Friday 17 January 2025 04:46:32 -0500 (0:00:00.075) 0:06:24.551 ********
ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737107178.6379445, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1737107178.6379445, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 407359, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1737107178.6379445, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9
Friday 17 January 2025 04:46:32 -0500 (0:00:00.401) 0:06:24.953 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16
Friday 17 January 2025 04:46:33 -0500 (0:00:00.106) 0:06:25.060 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23
Friday 17 January 2025 04:46:33 -0500 (0:00:00.074) 0:06:25.135 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed
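
[NOTE]: In the stat result above, "device_type": 64768 is the encoded device number of the block node: for small device numbers like these, major = dev // 256 and minor = dev % 256, so 64768 is 253:0 and the LUKS mapping stat'd below (64769) is 253:1; 253 is commonly the device-mapper major, though that is an inference here, not something the log states. A purely illustrative one-task decode:

    - name: Decode a dm device number into major:minor (sketch)
      debug:
        msg: "major={{ 64768 // 256 }} minor={{ 64768 % 256 }}"  # prints major=253 minor=0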

TASK [Process volume type (set initial value) (1/2)] ***************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29
Friday 17 January 2025 04:46:33 -0500 (0:00:00.113) 0:06:25.248 ********
ok: [managed-node1] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [Process volume type (get RAID value) (2/2)] ******************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33
Friday 17 January 2025 04:46:33 -0500 (0:00:00.076) 0:06:25.325 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38
Friday 17 January 2025 04:46:33 -0500 (0:00:00.070) 0:06:25.396 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3
Friday 17 January 2025 04:46:33 -0500 (0:00:00.068) 0:06:25.464 ********
ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737107178.7499444, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1737107178.7499444, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 407946, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1737107178.7499444, "nlink": 1, "path": "/dev/mapper/luks-79e8fafe-36db-4db0-bb7e-9669c9da3637", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Ensure cryptsetup is present] ********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10
Friday 17 January 2025 04:46:33 -0500 (0:00:00.416) 0:06:25.881 ********
ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16
Friday 17 January 2025 04:46:34 -0500 (0:00:00.768) 0:06:26.650 ********
ok: [managed-node1] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.026218", "end": "2025-01-17 04:46:35.115209", "rc": 0, "start": "2025-01-17 04:46:35.088991" }
STDOUT:
LUKS header information for /dev/mapper/foo-test1

Version:        1
Cipher name:    aes
Cipher mode:    xts-plain64
Hash spec:      sha256
Payload offset: 8192
MK bits:        512
MK digest:      45 51 52 ee cc ce 35 f8 51 94 24 30 bc b8 f1 26 a3 1d 43 70
MK salt:        fb b0 6e c2 67 5a f0 e7 79 ef a8 f6 0a c6 6e c9
                da 7a 34 16 8c eb e4 c1 1b de 06 22 29 a2 91 bb
MK iterations:  23372
UUID:           79e8fafe-36db-4db0-bb7e-9669c9da3637

Key Slot 0: ENABLED
        Iterations:             373956
        Salt:                   8a 58 8f 31 cd 55 35 42 05 09 1e 07 25 19 93 b3
                                2f 24 61 a0 21 ad a8 fc 64 e2 df a6 58 73 2e 95
        Key material offset:    8
        AF stripes:             4000
Key Slot 1: DISABLED
Key Slot 2: DISABLED
Key Slot 3: DISABLED
Key Slot 4: DISABLED
Key Slot 5: DISABLED
Key Slot 6: DISABLED
Key Slot 7: DISABLED

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22
Friday 17 January 2025 04:46:35 -0500 (0:00:00.591) 0:06:27.242 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29
Friday 17 January 2025 04:46:35 -0500 (0:00:00.070) 0:06:27.313 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40
Friday 17 January 2025 04:46:35 -0500 (0:00:00.081) 0:06:27.395 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46
Friday 17 January 2025 04:46:35 -0500 (0:00:00.079) 0:06:27.474 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Check LUKS version] ******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51
Friday 17 January 2025 04:46:35 -0500 (0:00:00.088) 0:06:27.562 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Check LUKS key size] *****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63
Friday 17 January 2025 04:46:35 -0500 (0:00:00.087) 0:06:27.650 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75
Friday 17 January 2025 04:46:35 -0500 (0:00:00.091) 0:06:27.742 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Set test variables] ******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87
Friday 17 January 2025 04:46:35 -0500 (0:00:00.078) 0:06:27.821 ********
ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-79e8fafe-36db-4db0-bb7e-9669c9da3637 /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93
Friday 17 January 2025 04:46:35 -0500 (0:00:00.074) 0:06:27.895 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed
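
[NOTE]: The luksDump output above is the data behind the LUKS version/key size/cipher assertions that follow: "Version: 1" is the LUKS1 header format, and "Cipher name: aes" plus "Cipher mode: xts-plain64" plus "MK bits: 512" describe aes-xts with a 512-bit master key (XTS splits it into two 256-bit halves). A standalone sketch of such a check, illustrative only and not the role's code (luks_dump is a name invented here):

    - name: Collect the LUKS header (sketch)
      command: cryptsetup luksDump /dev/mapper/foo-test1
      register: luks_dump
      changed_when: false

    - name: Assert version, cipher and master key size
      assert:
        that:
          - luks_dump.stdout is search('Version:\s+1')
          - luks_dump.stdout is search('Cipher name:\s+aes')
          - luks_dump.stdout is search('MK bits:\s+512')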
[managed-node1] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Friday 17 January 2025 04:46:35 -0500 (0:00:00.067) 0:06:27.962 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Friday 17 January 2025 04:46:36 -0500 (0:00:00.069) 0:06:28.032 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Friday 17 January 2025 04:46:36 -0500 (0:00:00.077) 0:06:28.110 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Friday 17 January 2025 04:46:36 -0500 (0:00:00.074) 0:06:28.184 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Friday 17 January 2025 04:46:36 -0500 (0:00:00.057) 0:06:28.242 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Friday 17 January 2025 04:46:36 -0500 (0:00:00.062) 0:06:28.305 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Friday 17 January 2025 04:46:36 -0500 (0:00:00.199) 0:06:28.505 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Friday 17 January 2025 04:46:36 -0500 (0:00:00.056) 0:06:28.562 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Friday 17 January 2025 04:46:36 -0500 (0:00:00.060) 0:06:28.622 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] 
**************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Friday 17 January 2025 04:46:36 -0500 (0:00:00.139) 0:06:28.762 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Friday 17 January 2025 04:46:36 -0500 (0:00:00.128) 0:06:28.890 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Friday 17 January 2025 04:46:37 -0500 (0:00:00.132) 0:06:29.022 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Friday 17 January 2025 04:46:37 -0500 (0:00:00.074) 0:06:29.097 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Friday 17 January 2025 04:46:37 -0500 (0:00:00.075) 0:06:29.173 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Friday 17 January 2025 04:46:37 -0500 (0:00:00.082) 0:06:29.255 ******** ok: [managed-node1] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Friday 17 January 2025 04:46:37 -0500 (0:00:00.564) 0:06:29.819 ******** ok: [managed-node1] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Friday 17 January 2025 04:46:38 -0500 (0:00:00.529) 0:06:30.349 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Friday 17 January 2025 04:46:38 -0500 (0:00:00.082) 0:06:30.431 ******** ok: [managed-node1] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Friday 
17 January 2025 04:46:38 -0500 (0:00:00.072) 0:06:30.504 ******** ok: [managed-node1] => { "bytes": 10737418240, "changed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Friday 17 January 2025 04:46:39 -0500 (0:00:00.665) 0:06:31.169 ******** skipping: [managed-node1] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Friday 17 January 2025 04:46:39 -0500 (0:00:00.079) 0:06:31.248 ******** skipping: [managed-node1] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Friday 17 January 2025 04:46:39 -0500 (0:00:00.093) 0:06:31.342 ******** skipping: [managed-node1] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Friday 17 January 2025 04:46:39 -0500 (0:00:00.136) 0:06:31.478 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Friday 17 January 2025 04:46:39 -0500 (0:00:00.125) 0:06:31.604 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Friday 17 January 2025 04:46:39 -0500 (0:00:00.104) 0:06:31.709 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Friday 17 January 2025 04:46:39 -0500 (0:00:00.078) 0:06:31.787 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Friday 17 January 2025 04:46:39 -0500 (0:00:00.058) 0:06:31.846 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Friday 17 January 2025 04:46:39 -0500 (0:00:00.059) 0:06:31.905 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: 
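
[NOTE]: The size checks above reduce everything to integer byte counts before comparing: the LV reports 4294967296 bytes, which is 4 x 2^30, i.e. 4 GiB (LVM's "4g"), inside a 10737418240-byte (10 GiB) VG. Ansible's human_to_bytes filter performs the same binary (1024-based) normalization; a one-task sketch, not taken from the test files:

    - name: 4g and 10 GiB normalize to the byte counts seen above (sketch)
      assert:
        that:
          - "'4g' | human_to_bytes == 4294967296"
          - "'10 GiB' | human_to_bytes == 10737418240"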

TASK [Apply lower size limit to max usable thin pool space] ********************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91
Friday 17 January 2025 04:46:39 -0500 (0:00:00.057) 0:06:31.963 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Convert maximum usable thin pool space from int to Size] *****************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96
Friday 17 January 2025 04:46:40 -0500 (0:00:00.066) 0:06:32.030 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Show max thin pool size] *************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101
Friday 17 January 2025 04:46:40 -0500 (0:00:00.058) 0:06:32.089 ********
skipping: [managed-node1] => {}

TASK [Show volume thin pool size] **********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105
Friday 17 January 2025 04:46:40 -0500 (0:00:00.066) 0:06:32.155 ********
skipping: [managed-node1] => {}

TASK [Show test volume size] ***************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109
Friday 17 January 2025 04:46:40 -0500 (0:00:00.058) 0:06:32.214 ********
skipping: [managed-node1] => {}

TASK [Establish base value for expected thin pool size] ************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113
Friday 17 January 2025 04:46:40 -0500 (0:00:00.062) 0:06:32.276 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120
Friday 17 January 2025 04:46:40 -0500 (0:00:00.046) 0:06:32.323 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected thin pool volume size] *****************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127
Friday 17 January 2025 04:46:40 -0500 (0:00:00.050) 0:06:32.373 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Calculate the expected thin pool volume size based on percentage value] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131
Friday 17 January 2025 04:46:40 -0500 (0:00:00.049) 0:06:32.423 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Replace expected volume size with calculated value] **********************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137
Friday 17 January 2025 04:46:40 -0500 (0:00:00.056) 0:06:32.480 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Show actual size] ********************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143
Friday 17 January 2025 04:46:40 -0500 (0:00:00.039) 0:06:32.519 ********
ok: [managed-node1] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } }

TASK [Show expected size] ******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147
Friday 17 January 2025 04:46:40 -0500 (0:00:00.041) 0:06:32.561 ********
ok: [managed-node1] => { "storage_test_expected_size": "4294967296" }

TASK [Assert expected size is actual size] *************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151
Friday 17 January 2025 04:46:40 -0500 (0:00:00.041) 0:06:32.602 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5
Friday 17 January 2025 04:46:40 -0500 (0:00:00.052) 0:06:32.654 ********
ok: [managed-node1] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.020821", "end": "2025-01-17 04:46:40.942157", "rc": 0, "start": "2025-01-17 04:46:40.921336" }
STDOUT:
LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [Set LV segment type] *****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13
Friday 17 January 2025 04:46:41 -0500 (0:00:00.370) 0:06:33.025 ********
ok: [managed-node1] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false }

TASK [Check segment type] ******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17
Friday 17 January 2025 04:46:41 -0500 (0:00:00.049) 0:06:33.075 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Set LV cache size] *******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24
Friday 17 January 2025 04:46:41 -0500 (0:00:00.048) 0:06:33.123 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the requested cache size] ******************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31
Friday 17 January 2025 04:46:41 -0500 (0:00:00.042) 0:06:33.166 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set expected cache size] *************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37
Friday 17 January 2025 04:46:41 -0500 (0:00:00.041) 0:06:33.207 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
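
[NOTE]: The LVM2_*=value pairs above are what --nameprefixes is for: every report column becomes a shell-style assignment that is easy to split and match; here the test slices out LVM2_SEGTYPE to confirm the LV is linear (i.e. not cached). A compact way to parse the same line, as an illustrative sketch with names invented here:

    - name: Extract the segment type from a nameprefixes report line (sketch)
      set_fact:
        lv_segtype: "{{ report_line | regex_replace('^.*LVM2_SEGTYPE=(\\S+).*$', '\\1') }}"
      vars:
        report_line: "LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear"

    - name: The volume must be linear when no cache was requested
      assert:
        that:
          - lv_segtype == 'linear'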
"changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Friday 17 January 2025 04:46:41 -0500 (0:00:00.040) 0:06:33.248 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Friday 17 January 2025 04:46:41 -0500 (0:00:00.039) 0:06:33.287 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Friday 17 January 2025 04:46:41 -0500 (0:00:00.037) 0:06:33.325 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Friday 17 January 2025 04:46:41 -0500 (0:00:00.032) 0:06:33.357 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Verify preservation of encryption settings on existing LVM volume] ******* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:396 Friday 17 January 2025 04:46:41 -0500 (0:00:00.040) 0:06:33.398 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 January 2025 04:46:41 -0500 (0:00:00.096) 0:06:33.494 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 January 2025 04:46:41 -0500 (0:00:00.059) 0:06:33.553 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 January 2025 04:46:41 -0500 (0:00:00.044) 0:06:33.598 ******** skipping: [managed-node1] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node1] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node1] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] 
}, "ansible_included_var_files": [ "/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node1] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 January 2025 04:46:41 -0500 (0:00:00.088) 0:06:33.686 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 January 2025 04:46:41 -0500 (0:00:00.036) 0:06:33.723 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Friday 17 January 2025 04:46:41 -0500 (0:00:00.036) 0:06:33.759 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 January 2025 04:46:41 -0500 (0:00:00.037) 0:06:33.797 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 January 2025 04:46:41 -0500 (0:00:00.039) 0:06:33.836 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Friday 17 January 2025 04:46:41 -0500 (0:00:00.083) 0:06:33.919 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 January 2025 04:46:45 -0500 (0:00:03.872) 0:06:37.792 ******** 
TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Friday 17 January 2025 04:46:45 -0500 (0:00:03.872) 0:06:37.792 ********
ok: [managed-node1] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] }

TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14
Friday 17 January 2025 04:46:45 -0500 (0:00:00.065) 0:06:37.857 ********
ok: [managed-node1] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
Friday 17 January 2025 04:46:45 -0500 (0:00:00.066) 0:06:37.924 ********
ok: [managed-node1] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] }

TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31
Friday 17 January 2025 04:46:50 -0500 (0:00:04.101) 0:06:42.026 ********
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node1

TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
Friday 17 January 2025 04:46:50 -0500 (0:00:00.097) 0:06:42.123 ********

TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13
Friday 17 January 2025 04:46:50 -0500 (0:00:00.091) 0:06:42.214 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19
Friday 17 January 2025 04:46:50 -0500 (0:00:00.057) 0:06:42.271 ********

TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37
Friday 17 January 2025 04:46:50 -0500 (0:00:00.042) 0:06:42.314 ********
ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] }

TASK [fedora.linux_system_roles.storage : Get service facts] *******************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51
Friday 17 January 2025 04:46:51 -0500 (0:00:00.780) 0:06:43.095 ********
ok: [managed-node1] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running",
"status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": 
"debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": 
"lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": 
"plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-cryptsetup@luks\\x2dfde34b0b\\x2d0d82\\x2d46ce\\x2db76a\\x2d4930d8f4912d.service": { "name": "systemd-cryptsetup@luks\\x2dfde34b0b\\x2d0d82\\x2d46ce\\x2db76a\\x2d4930d8f4912d.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": 
"systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Friday 17 January 2025 04:46:52 -0500 (0:00:01.181) 0:06:44.276 ******** ok: [managed-node1] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2dfde34b0b\\x2d0d82\\x2d46ce\\x2db76a\\x2d4930d8f4912d.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Friday 17 January 2025 04:46:52 -0500 (0:00:00.077) 0:06:44.354 ******** changed: [managed-node1] => (item=systemd-cryptsetup@luks\x2dfde34b0b\x2d0d82\x2d46ce\x2db76a\x2d4930d8f4912d.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dfde34b0b\\x2d0d82\\x2d46ce\\x2db76a\\x2d4930d8f4912d.service", "name": "systemd-cryptsetup@luks\\x2dfde34b0b\\x2d0d82\\x2d46ce\\x2db76a\\x2d4930d8f4912d.service", "status": { 
"ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-readahead-collect.service system-systemd\\x2dcryptsetup.slice cryptsetup-pre.target systemd-readahead-replay.service dev-sda1.device systemd-journald.socket", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "Delegate": "no", "Description": "Cryptography Setup for luks-fde34b0b-0d82-46ce-b76a-4930d8f4912d", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-fde34b0b-0d82-46ce-b76a-4930d8f4912d /dev/sda1 VALUE_SPECIFIED_IN_NO_LOG_PARAMETER ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-fde34b0b-0d82-46ce-b76a-4930d8f4912d ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2dfde34b0b\\x2d0d82\\x2d46ce\\x2db76a\\x2d4930d8f4912d.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2dfde34b0b\\x2d0d82\\x2d46ce\\x2db76a\\x2d4930d8f4912d.service", "IgnoreOnIsolate": "yes", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2dfde34b0b\\x2d0d82\\x2d46ce\\x2db76a\\x2d4930d8f4912d.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", 
"ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "0", "TimeoutStopUSec": "0", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "bad", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Friday 17 January 2025 04:46:52 -0500 (0:00:00.645) 0:06:44.999 ******** ok: [managed-node1] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-79e8fafe-36db-4db0-bb7e-9669c9da3637" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-79e8fafe-36db-4db0-bb7e-9669c9da3637", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-79e8fafe-36db-4db0-bb7e-9669c9da3637", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-79e8fafe-36db-4db0-bb7e-9669c9da3637", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", 
"raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Friday 17 January 2025 04:46:57 -0500 (0:00:04.317) 0:06:49.317 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Friday 17 January 2025 04:46:57 -0500 (0:00:00.127) 0:06:49.444 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737107181.0259469, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "ced3592ae7b2c54427438b85f160a5132d55fc74", "ctime": 1737107181.0229468, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263644, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1737107181.0229468, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1311, "uid": 0, "version": "18446744072031193646", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Friday 17 January 2025 04:46:57 -0500 (0:00:00.513) 0:06:49.958 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Friday 17 January 2025 04:46:58 -0500 (0:00:00.062) 0:06:50.020 ******** changed: [managed-node1] => (item=systemd-cryptsetup@luks\x2dfde34b0b\x2d0d82\x2d46ce\x2db76a\x2d4930d8f4912d.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dfde34b0b\\x2d0d82\\x2d46ce\\x2db76a\\x2d4930d8f4912d.service", "name": "systemd-cryptsetup@luks\\x2dfde34b0b\\x2d0d82\\x2d46ce\\x2db76a\\x2d4930d8f4912d.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", 
"ConditionTimestampMonotonic": "0", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2dfde34b0b\\x2d0d82\\x2d46ce\\x2db76a\\x2d4930d8f4912d.service", "DevicePolicy": "auto", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2dfde34b0b\\x2d0d82\\x2d46ce\\x2db76a\\x2d4930d8f4912d.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "1048576", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "masked", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2dfde34b0b\\x2d0d82\\x2d46ce\\x2db76a\\x2d4930d8f4912d.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", "UnitFileState": "bad", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Friday 17 January 2025 04:46:58 -0500 (0:00:00.777) 0:06:50.798 ******** ok: [managed-node1] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", 
"/dev/mapper/luks-79e8fafe-36db-4db0-bb7e-9669c9da3637" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-79e8fafe-36db-4db0-bb7e-9669c9da3637", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-79e8fafe-36db-4db0-bb7e-9669c9da3637", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-79e8fafe-36db-4db0-bb7e-9669c9da3637", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Friday 17 January 2025 04:46:58 -0500 (0:00:00.142) 0:06:50.940 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-79e8fafe-36db-4db0-bb7e-9669c9da3637", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-79e8fafe-36db-4db0-bb7e-9669c9da3637", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", 
"mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Friday 17 January 2025 04:46:58 -0500 (0:00:00.072) 0:06:51.012 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Friday 17 January 2025 04:46:59 -0500 (0:00:00.070) 0:06:51.082 ******** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Friday 17 January 2025 04:46:59 -0500 (0:00:00.055) 0:06:51.137 ******** ok: [managed-node1] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Friday 17 January 2025 04:46:59 -0500 (0:00:00.592) 0:06:51.730 ******** ok: [managed-node1] => (item={u'src': u'/dev/mapper/luks-79e8fafe-36db-4db0-bb7e-9669c9da3637', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-79e8fafe-36db-4db0-bb7e-9669c9da3637", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-79e8fafe-36db-4db0-bb7e-9669c9da3637" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Friday 17 January 2025 04:47:00 -0500 (0:00:00.516) 0:06:52.246 ******** skipping: [managed-node1] => (item={u'src': u'/dev/mapper/luks-79e8fafe-36db-4db0-bb7e-9669c9da3637', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-79e8fafe-36db-4db0-bb7e-9669c9da3637", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Friday 17 January 2025 04:47:00 -0500 (0:00:00.088) 0:06:52.335 ******** ok: [managed-node1] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Friday 17 January 2025 04:47:00 -0500 (0:00:00.659) 0:06:52.995 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737107184.5649507, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "36a7767fecac79f015a3f808ca7d6b6631573205", "ctime": 1737107182.6249485, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263663, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1737107182.6229486, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "18446744072031200077", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Friday 17 January 2025 04:47:01 -0500 (0:00:00.569) 0:06:53.565 ******** TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Friday 17 January 2025 04:47:01 -0500 (0:00:00.069) 0:06:53.634 ******** ok: [managed-node1] TASK [Assert preservation of encryption settings on existing LVM volume] ******* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:410 Friday 17 January 2025 04:47:02 -0500 (0:00:00.947) 0:06:54.581 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify role results] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:417 Friday 17 January 2025 04:47:02 -0500 (0:00:00.100) 0:06:54.681 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node1 TASK [Print out pool information] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Friday 17 January 2025 04:47:02 -0500 (0:00:00.133) 0:06:54.815 ******** ok: [managed-node1] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", 
"type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-79e8fafe-36db-4db0-bb7e-9669c9da3637", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-79e8fafe-36db-4db0-bb7e-9669c9da3637", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Friday 17 January 2025 04:47:02 -0500 (0:00:00.086) 0:06:54.901 ******** skipping: [managed-node1] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Friday 17 January 2025 04:47:02 -0500 (0:00:00.070) 0:06:54.972 ******** ok: [managed-node1] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "79e8fafe-36db-4db0-bb7e-9669c9da3637" }, "/dev/mapper/luks-79e8fafe-36db-4db0-bb7e-9669c9da3637": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-79e8fafe-36db-4db0-bb7e-9669c9da3637", "size": "4G", "type": "crypt", "uuid": "16845911-a6ba-48f2-8094-6a7bfd5a0255" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "q5PvpB-4kw5-zuiW-X5c3-3C9L-v0UQ-y98OeM" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", 
"uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Friday 17 January 2025 04:47:03 -0500 (0:00:00.544) 0:06:55.516 ******** ok: [managed-node1] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002911", "end": "2025-01-17 04:47:03.840469", "rc": 0, "start": "2025-01-17 04:47:03.837558" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Thu Jun 20 10:23:46 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-79e8fafe-36db-4db0-bb7e-9669c9da3637 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Friday 17 January 2025 04:47:03 -0500 (0:00:00.466) 0:06:55.983 ******** ok: [managed-node1] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002792", "end": "2025-01-17 04:47:04.508626", "failed_when_result": false, "rc": 0, "start": "2025-01-17 04:47:04.505834" } STDOUT: luks-79e8fafe-36db-4db0-bb7e-9669c9da3637 /dev/mapper/foo-test1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Friday 17 January 2025 04:47:04 -0500 (0:00:00.683) 0:06:56.666 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node1 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Friday 17 January 2025 04:47:04 -0500 (0:00:00.149) 0:06:56.816 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Friday 17 January 2025 04:47:04 -0500 (0:00:00.083) 0:06:56.900 
******** ok: [managed-node1] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.020524", "end": "2025-01-17 04:47:05.416848", "rc": 0, "start": "2025-01-17 04:47:05.396324" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Friday 17 January 2025 04:47:05 -0500 (0:00:00.640) 0:06:57.540 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Friday 17 January 2025 04:47:05 -0500 (0:00:00.108) 0:06:57.648 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node1 TASK [Set test variables] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Friday 17 January 2025 04:47:05 -0500 (0:00:00.168) 0:06:57.817 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Friday 17 January 2025 04:47:05 -0500 (0:00:00.072) 0:06:57.890 ******** ok: [managed-node1] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Friday 17 January 2025 04:47:06 -0500 (0:00:00.505) 0:06:58.395 ******** ok: [managed-node1] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Friday 17 January 2025 04:47:06 -0500 (0:00:00.128) 0:06:58.524 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Friday 17 January 2025 04:47:06 -0500 (0:00:00.098) 0:06:58.622 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Friday 17 January 2025 04:47:06 -0500 (0:00:00.120) 0:06:58.742 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: 
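
The "Get VG shared value status" command above, vgs --noheadings --binary -o shared foo, prints 1 for a shared (lockd-managed) volume group and 0 otherwise; the assertion that follows simply checks for "0". Reproduced as a standalone pair of tasks, assuming the VG is named foo as in the log (the registered variable name is illustrative):

- name: Query the VG shared flag (sketch)
  ansible.builtin.command: vgs --noheadings --binary -o shared foo
  register: vg_shared  # hypothetical name
  changed_when: false

- name: Assert the VG is not shared (sketch)
  ansible.builtin.assert:
    that:
      - vg_shared.stdout | trim == '0'
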
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Friday 17 January 2025 04:47:06 -0500 (0:00:00.087) 0:06:58.829 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Friday 17 January 2025 04:47:06 -0500 (0:00:00.076) 0:06:58.906 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:51 Friday 17 January 2025 04:47:06 -0500 (0:00:00.056) 0:06:58.963 ******** ok: [managed-node1] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:64 Friday 17 January 2025 04:47:07 -0500 (0:00:00.080) 0:06:59.043 ******** ok: [managed-node1] => { "changed": false, "rc": 0 } STDOUT: False STDERR: Shared connection to 10.31.46.65 closed. TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:73 Friday 17 January 2025 04:47:07 -0500 (0:00:00.303) 0:06:59.347 ******** skipping: [managed-node1] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:83 Friday 17 January 2025 04:47:07 -0500 (0:00:00.063) 0:06:59.410 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node1 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Friday 17 January 2025 04:47:07 -0500 (0:00:00.118) 0:06:59.529 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Friday 17 January 2025 04:47:07 -0500 (0:00:00.056) 0:06:59.586 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Friday 17 January 2025 04:47:07 -0500 (0:00:00.056) 0:06:59.642 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Friday 17 January 2025 04:47:07 -0500 (0:00:00.055) 0:06:59.697 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Friday 17 January 2025 04:47:07 -0500 (0:00:00.054) 0:06:59.752 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Friday 17 January 2025 04:47:07 -0500 (0:00:00.062) 0:06:59.815 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Friday 17 January 2025 04:47:07 -0500 (0:00:00.064) 0:06:59.879 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Friday 17 January 2025 04:47:07 -0500 (0:00:00.060) 0:06:59.939 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Friday 17 January 2025 04:47:07 -0500 (0:00:00.056) 0:06:59.996 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Friday 17 January 2025 04:47:08 -0500 (0:00:00.056) 0:07:00.052 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Friday 17 January 2025 04:47:08 -0500 (0:00:00.058) 0:07:00.111 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:86 Friday 17 January 2025 04:47:08 -0500 (0:00:00.067) 0:07:00.179 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node1 TASK [Validate pool member LVM RAID settings] ********************************** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Friday 17 January 2025 04:47:08 -0500 (0:00:00.108) 0:07:00.288 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node1 TASK [Get information about the LV] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Friday 17 January 2025 04:47:08 -0500 (0:00:00.094) 0:07:00.382 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Friday 17 January 2025 04:47:08 -0500 (0:00:00.041) 0:07:00.423 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Friday 17 January 2025 04:47:08 -0500 (0:00:00.038) 0:07:00.462 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Friday 17 January 2025 04:47:08 -0500 (0:00:00.043) 0:07:00.506 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Friday 17 January 2025 04:47:08 -0500 (0:00:00.055) 0:07:00.562 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Friday 17 January 2025 04:47:08 -0500 (0:00:00.057) 0:07:00.619 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Friday 17 January 2025 04:47:08 -0500 (0:00:00.057) 0:07:00.676 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:89 Friday 17 January 2025 04:47:08 -0500 (0:00:00.061) 0:07:00.738 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node1 TASK [Validate pool member thinpool settings] ********************************** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Friday 17 January 2025 04:47:08 -0500 (0:00:00.133) 0:07:00.871 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node1 TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Friday 17 January 2025 04:47:08 -0500 (0:00:00.117) 0:07:00.989 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Friday 17 January 2025 04:47:09 -0500 (0:00:00.057) 0:07:01.047 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Friday 17 January 2025 04:47:09 -0500 (0:00:00.055) 0:07:01.103 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Friday 17 January 2025 04:47:09 -0500 (0:00:00.056) 0:07:01.159 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:92 Friday 17 January 2025 04:47:09 -0500 (0:00:00.056) 0:07:01.216 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node1 TASK [Set test variables] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Friday 17 January 2025 04:47:09 -0500 (0:00:00.194) 0:07:01.411 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Friday 17 January 2025 04:47:09 -0500 (0:00:00.048) 0:07:01.459 ******** skipping: [managed-node1] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Friday 17 January 2025 04:47:09 -0500 (0:00:00.049) 0:07:01.508 ******** included: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node1 TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Friday 17 January 2025 04:47:09 -0500 (0:00:00.072) 0:07:01.581 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Friday 17 January 2025 04:47:09 -0500 (0:00:00.046) 0:07:01.627 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Friday 17 January 2025 04:47:09 -0500 (0:00:00.045) 0:07:01.672 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Friday 17 January 2025 04:47:09 -0500 (0:00:00.037) 0:07:01.710 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Friday 17 January 2025 04:47:09 -0500 (0:00:00.042) 0:07:01.752 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Friday 17 January 2025 04:47:09 -0500 (0:00:00.052) 0:07:01.804 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Friday 17 January 2025 04:47:09 -0500 (0:00:00.060) 0:07:01.865 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:95 Friday 17 January 2025 04:47:09 -0500 (0:00:00.092) 0:07:01.958 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node1 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Friday 17 January 2025 04:47:10 -0500 (0:00:00.136) 0:07:02.094 ******** included: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node1 TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Friday 17 January 2025 04:47:10 -0500 (0:00:00.129) 0:07:02.224 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Friday 17 January 2025 04:47:10 -0500 (0:00:00.059) 0:07:02.284 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Friday 17 January 2025 04:47:10 -0500 (0:00:00.058) 0:07:02.343 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Friday 17 January 2025 04:47:10 -0500 (0:00:00.066) 0:07:02.409 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO compression is off] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Friday 17 January 2025 04:47:10 -0500 (0:00:00.059) 0:07:02.468 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO compression is on] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Friday 17 January 2025 04:47:10 -0500 (0:00:00.058) 0:07:02.526 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Friday 17 January 2025 04:47:10 -0500 (0:00:00.065) 0:07:02.592 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:98 Friday 17 January 2025 04:47:10 -0500 (0:00:00.152) 0:07:02.652 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node1 TASK [Run 'stratis report'] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Friday 17 January 2025 04:47:10 -0500 (0:00:00.057) 0:07:02.804 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about Stratis]
******************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:11 Friday 17 January 2025 04:47:10 -0500 (0:00:00.057) 0:07:02.862 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pool was created] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Friday 17 January 2025 04:47:10 -0500 (0:00:00.056) 0:07:02.919 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:25 Friday 17 January 2025 04:47:10 -0500 (0:00:00.075) 0:07:02.994 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:34 Friday 17 January 2025 04:47:11 -0500 (0:00:00.060) 0:07:03.054 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:44 Friday 17 January 2025 04:47:11 -0500 (0:00:00.059) 0:07:03.114 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:101 Friday 17 January 2025 04:47:11 -0500 (0:00:00.058) 0:07:03.173 ******** ok: [managed-node1] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Friday 17 January 2025 04:47:11 -0500 (0:00:00.062) 0:07:03.235 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node1 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Friday 17 January 2025 04:47:11 -0500 (0:00:00.095) 0:07:03.331 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Friday 17 January 2025 04:47:11 -0500 (0:00:00.058) 0:07:03.390 ******** included:
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node1 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Friday 17 January 2025 04:47:11 -0500 (0:00:00.227) 0:07:03.618 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-79e8fafe-36db-4db0-bb7e-9669c9da3637" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Friday 17 January 2025 04:47:11 -0500 (0:00:00.069) 0:07:03.688 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Friday 17 January 2025 04:47:11 -0500 (0:00:00.080) 0:07:03.768 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Friday 17 January 2025 04:47:11 -0500 (0:00:00.071) 0:07:03.840 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Friday 17 January 2025 04:47:11 -0500 (0:00:00.068) 0:07:03.909 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Friday 17 January 2025 04:47:11 -0500 (0:00:00.052) 0:07:03.961 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Friday 17 January 2025 04:47:12 -0500 (0:00:00.062) 0:07:04.024 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Friday 17 January 2025 04:47:12 -0500 (0:00:00.045) 0:07:04.069 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Friday 17 January 2025 04:47:12 -0500 (0:00:00.057) 0:07:04.127 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Friday 17 January 2025 04:47:12 -0500 (0:00:00.068) 0:07:04.196 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Friday 17 January 2025 04:47:12 -0500 (0:00:00.070) 0:07:04.266 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Friday 17 January 2025 04:47:12 -0500 (0:00:00.081) 0:07:04.348 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-79e8fafe-36db-4db0-bb7e-9669c9da3637 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Friday 17 January 2025 04:47:12 -0500 (0:00:00.099) 0:07:04.448 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Friday 17 January 2025 04:47:12 -0500 (0:00:00.069) 0:07:04.517 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: 
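
The facts set above (storage_test_fstab_id_matches and friends) are counts of regex hits against the fstab contents captured earlier; the expected value "1" means exactly one fstab line references the LUKS mapping. A hedged reconstruction of that bookkeeping, assuming a variable fstab_contents registered from the earlier "cat /etc/fstab" step (both names here are illustrative, not the test suite's):

- name: Recompute the fstab device-id match count (sketch)
  ansible.builtin.set_fact:
    fstab_id_matches: "{{ fstab_contents.stdout_lines | select('search', '^/dev/mapper/luks-79e8fafe-36db-4db0-bb7e-9669c9da3637 ') | list | length }}"
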
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Friday 17 January 2025 04:47:12 -0500 (0:00:00.060) 0:07:04.577 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Friday 17 January 2025 04:47:12 -0500 (0:00:00.037) 0:07:04.615 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Friday 17 January 2025 04:47:12 -0500 (0:00:00.046) 0:07:04.661 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Friday 17 January 2025 04:47:12 -0500 (0:00:00.059) 0:07:04.721 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Friday 17 January 2025 04:47:12 -0500 (0:00:00.076) 0:07:04.798 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Friday 17 January 2025 04:47:12 -0500 (0:00:00.209) 0:07:05.007 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737107195.1039617, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1737107178.6379445, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 407359, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1737107178.6379445, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Friday 17 January 2025 04:47:13 -0500 (0:00:00.511) 0:07:05.519 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Friday 17 January 2025 04:47:13 -0500 (0:00:00.073) 0:07:05.592 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Friday 17 January 2025 04:47:13 -0500 (0:00:00.084) 0:07:05.677 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Friday 17 January 2025 04:47:13 -0500 (0:00:00.076) 0:07:05.754 ******** ok: [managed-node1] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Friday 17 January 2025 04:47:13 -0500 (0:00:00.067) 0:07:05.822 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Friday 17 January 2025 04:47:13 -0500 (0:00:00.061) 0:07:05.883 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Friday 17 January 2025 04:47:13 -0500 (0:00:00.073) 0:07:05.957 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737107178.7499444, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1737107178.7499444, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 407946, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1737107178.7499444, "nlink": 1, "path": "/dev/mapper/luks-79e8fafe-36db-4db0-bb7e-9669c9da3637", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Friday 17 January 2025 04:47:14 -0500 (0:00:00.455) 0:07:06.413 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Friday 17 January 2025 04:47:15 -0500 (0:00:00.788) 0:07:07.202 
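
The result that follows dumps the LUKS header of /dev/mapper/foo-test1; the fields to note are Version (1, matching the requested encryption_luks_version of "luks1"), the aes cipher in xts-plain64 mode, and Key Slot 0 being the only enabled slot. A minimal sketch of the same version check, with an illustrative variable name:

- name: Dump the LUKS header (sketch)
  ansible.builtin.command: cryptsetup luksDump /dev/mapper/foo-test1
  register: luks_dump  # hypothetical name
  changed_when: false

- name: Assert the header is LUKS version 1 (sketch)
  ansible.builtin.assert:
    that:
      - luks_dump.stdout is search('Version:\s+1')
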
******** ok: [managed-node1] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.026079", "end": "2025-01-17 04:47:15.531278", "rc": 0, "start": "2025-01-17 04:47:15.505199" } STDOUT: LUKS header information for /dev/mapper/foo-test1 Version: 1 Cipher name: aes Cipher mode: xts-plain64 Hash spec: sha256 Payload offset: 8192 MK bits: 512 MK digest: 45 51 52 ee cc ce 35 f8 51 94 24 30 bc b8 f1 26 a3 1d 43 70 MK salt: fb b0 6e c2 67 5a f0 e7 79 ef a8 f6 0a c6 6e c9 da 7a 34 16 8c eb e4 c1 1b de 06 22 29 a2 91 bb MK iterations: 23372 UUID: 79e8fafe-36db-4db0-bb7e-9669c9da3637 Key Slot 0: ENABLED Iterations: 373956 Salt: 8a 58 8f 31 cd 55 35 42 05 09 1e 07 25 19 93 b3 2f 24 61 a0 21 ad a8 fc 64 e2 df a6 58 73 2e 95 Key material offset: 8 AF stripes: 4000 Key Slot 1: DISABLED Key Slot 2: DISABLED Key Slot 3: DISABLED Key Slot 4: DISABLED Key Slot 5: DISABLED Key Slot 6: DISABLED Key Slot 7: DISABLED TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Friday 17 January 2025 04:47:15 -0500 (0:00:00.480) 0:07:07.682 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Friday 17 January 2025 04:47:15 -0500 (0:00:00.090) 0:07:07.773 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Friday 17 January 2025 04:47:15 -0500 (0:00:00.099) 0:07:07.872 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Friday 17 January 2025 04:47:15 -0500 (0:00:00.082) 0:07:07.955 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Friday 17 January 2025 04:47:16 -0500 (0:00:00.073) 0:07:08.029 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Friday 17 January 2025 04:47:16 -0500 (0:00:00.081) 0:07:08.110 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Friday 17 January 2025 04:47:16 -0500 (0:00:00.061) 0:07:08.172 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] 
****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Friday 17 January 2025 04:47:16 -0500 (0:00:00.072) 0:07:08.244 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-79e8fafe-36db-4db0-bb7e-9669c9da3637 /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Friday 17 January 2025 04:47:16 -0500 (0:00:00.072) 0:07:08.316 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Friday 17 January 2025 04:47:16 -0500 (0:00:00.124) 0:07:08.441 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Friday 17 January 2025 04:47:16 -0500 (0:00:00.088) 0:07:08.530 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Friday 17 January 2025 04:47:16 -0500 (0:00:00.108) 0:07:08.638 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Friday 17 January 2025 04:47:16 -0500 (0:00:00.149) 0:07:08.788 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Friday 17 January 2025 04:47:16 -0500 (0:00:00.108) 0:07:08.896 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Friday 17 January 2025 04:47:16 -0500 (0:00:00.064) 0:07:08.960 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Friday 17 January 2025 04:47:17 -0500 (0:00:00.079) 0:07:09.039 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[Set md version regex] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Friday 17 January 2025 04:47:17 -0500 (0:00:00.059) 0:07:09.099 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Friday 17 January 2025 04:47:17 -0500 (0:00:00.057) 0:07:09.157 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Friday 17 January 2025 04:47:17 -0500 (0:00:00.048) 0:07:09.206 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Friday 17 January 2025 04:47:17 -0500 (0:00:00.045) 0:07:09.251 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Friday 17 January 2025 04:47:17 -0500 (0:00:00.059) 0:07:09.311 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Friday 17 January 2025 04:47:17 -0500 (0:00:00.061) 0:07:09.372 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Friday 17 January 2025 04:47:17 -0500 (0:00:00.048) 0:07:09.421 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Friday 17 January 2025 04:47:17 -0500 (0:00:00.052) 0:07:09.473 ******** ok: [managed-node1] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Friday 17 January 2025 04:47:17 -0500 (0:00:00.338) 0:07:09.812 ******** ok: [managed-node1] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: 
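
The two parses above agree because the requested size "4g" is interpreted as 4 GiB: 4 x 1024^3 = 4294967296 bytes, which is also the actual size reported for the LV. The size comparison therefore reduces to an equality check along these lines (a sketch; actual_bytes stands in for the parsed value and is not a variable from the test suite):

- name: Assert the actual size equals the requested 4 GiB (sketch)
  ansible.builtin.assert:
    that:
      - (actual_bytes | int) == 4 * 1024 * 1024 * 1024
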
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Friday 17 January 2025 04:47:18 -0500 (0:00:00.359) 0:07:10.172 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Friday 17 January 2025 04:47:18 -0500 (0:00:00.075) 0:07:10.247 ******** ok: [managed-node1] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Friday 17 January 2025 04:47:18 -0500 (0:00:00.083) 0:07:10.330 ******** ok: [managed-node1] => { "bytes": 10737418240, "changed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Friday 17 January 2025 04:47:18 -0500 (0:00:00.391) 0:07:10.722 ******** skipping: [managed-node1] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Friday 17 January 2025 04:47:18 -0500 (0:00:00.062) 0:07:10.784 ******** skipping: [managed-node1] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Friday 17 January 2025 04:47:18 -0500 (0:00:00.064) 0:07:10.849 ******** skipping: [managed-node1] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Friday 17 January 2025 04:47:18 -0500 (0:00:00.066) 0:07:10.915 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Friday 17 January 2025 04:47:18 -0500 (0:00:00.063) 0:07:10.979 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Friday 17 January 2025 04:47:19 -0500 (0:00:00.055) 0:07:11.034 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Friday 17 January 2025 04:47:19 -0500 (0:00:00.057) 0:07:11.092 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Friday 17 January 2025 04:47:19 -0500 (0:00:00.057) 0:07:11.149 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Friday 17 January 2025 04:47:19 -0500 (0:00:00.058) 0:07:11.207 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Friday 17 January 2025 04:47:19 -0500 (0:00:00.060) 0:07:11.268 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Friday 17 January 2025 04:47:19 -0500 (0:00:00.058) 0:07:11.327 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Friday 17 January 2025 04:47:19 -0500 (0:00:00.092) 0:07:11.420 ******** skipping: [managed-node1] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Friday 17 January 2025 04:47:19 -0500 (0:00:00.057) 0:07:11.477 ******** skipping: [managed-node1] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Friday 17 January 2025 04:47:19 -0500 (0:00:00.054) 0:07:11.532 ******** skipping: [managed-node1] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Friday 17 January 2025 04:47:19 -0500 (0:00:00.055) 0:07:11.587 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Friday 17 January 2025 04:47:19 -0500 (0:00:00.054) 0:07:11.641 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Friday 17 January 2025 04:47:19 -0500 (0:00:00.048) 0:07:11.690 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Friday 17 January 2025 04:47:19 -0500 (0:00:00.048) 0:07:11.739 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Friday 17 January 2025 04:47:19 -0500 (0:00:00.046) 0:07:11.785 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Friday 17 January 2025 04:47:19 -0500 (0:00:00.038) 0:07:11.824 ******** ok: [managed-node1] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Friday 17 January 2025 04:47:19 -0500 (0:00:00.042) 0:07:11.866 ******** ok: [managed-node1] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Friday 17 January 2025 04:47:19 -0500 (0:00:00.048) 0:07:11.915 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Friday 17 January 2025 04:47:19 -0500 (0:00:00.075) 0:07:11.991 ******** ok: [managed-node1] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.020542", "end": "2025-01-17 04:47:20.311960", "rc": 0, "start": "2025-01-17 04:47:20.291418" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Friday 17 January 2025 04:47:20 -0500 (0:00:00.432) 0:07:12.424 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Friday 17 January 2025 04:47:20 -0500 (0:00:00.073) 0:07:12.498 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed
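
The lvs call above combines --noheadings, --nameprefixes, --unquoted, --nosuffix and --units=b so the output is a single line of shell-style LVM2_KEY=value pairs that can be parsed without one query per field; LVM2_SEGTYPE=linear together with an empty LVM2_CACHE_TOTAL_BLOCKS confirms no cache is attached to the LV. A minimal sketch of reading the segment type the same way, with illustrative task and register names (lv_info and lv_segtype are not the test's actual variables):

- name: Query the segment type of foo/test1
  command: lvs --noheadings --nameprefixes --units=b --nosuffix --unquoted -o name,segtype foo/test1
  register: lv_info
  changed_when: false

- name: Parse LVM2_SEGTYPE out of the KEY=value pairs
  set_fact:
    lv_segtype: "{{ (lv_info.stdout.split() | select('match', '^LVM2_SEGTYPE=') | first).split('=')[1] }}"

- name: Check the segment type
  assert:
    that:
      - lv_segtype == 'linear'
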
"changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Friday 17 January 2025 04:47:20 -0500 (0:00:00.047) 0:07:12.610 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Friday 17 January 2025 04:47:20 -0500 (0:00:00.044) 0:07:12.655 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Friday 17 January 2025 04:47:20 -0500 (0:00:00.041) 0:07:12.697 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Friday 17 January 2025 04:47:20 -0500 (0:00:00.042) 0:07:12.740 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Friday 17 January 2025 04:47:20 -0500 (0:00:00.102) 0:07:12.842 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Friday 17 January 2025 04:47:20 -0500 (0:00:00.035) 0:07:12.877 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Friday 17 January 2025 04:47:20 -0500 (0:00:00.038) 0:07:12.916 ******** changed: [managed-node1] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:423 Friday 17 January 2025 04:47:21 -0500 (0:00:00.411) 0:07:13.328 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node1 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Friday 17 January 2025 04:47:21 -0500 (0:00:00.118) 0:07:13.446 ******** ok: [managed-node1] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, 
"storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Friday 17 January 2025 04:47:21 -0500 (0:00:00.086) 0:07:13.533 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 January 2025 04:47:21 -0500 (0:00:00.094) 0:07:13.628 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 January 2025 04:47:21 -0500 (0:00:00.092) 0:07:13.720 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 January 2025 04:47:21 -0500 (0:00:00.071) 0:07:13.792 ******** skipping: [managed-node1] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node1] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node1] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node1] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 January 2025 04:47:21 -0500 (0:00:00.137) 0:07:13.929 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 January 2025 04:47:21 -0500 (0:00:00.057) 0:07:13.987 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Friday 17 January 2025 04:47:22 -0500 (0:00:00.063) 0:07:14.051 ******** ok: [managed-node1] => { "ansible_facts": { 
"_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 January 2025 04:47:22 -0500 (0:00:00.072) 0:07:14.124 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 January 2025 04:47:22 -0500 (0:00:00.058) 0:07:14.182 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Friday 17 January 2025 04:47:22 -0500 (0:00:00.157) 0:07:14.340 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 January 2025 04:47:26 -0500 (0:00:04.050) 0:07:18.390 ******** ok: [managed-node1] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Friday 17 January 2025 04:47:26 -0500 (0:00:00.050) 0:07:18.441 ******** ok: [managed-node1] => { "storage_volumes": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Friday 17 January 2025 04:47:26 -0500 (0:00:00.048) 0:07:18.490 ******** ok: [managed-node1] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Friday 17 January 2025 04:47:30 -0500 (0:00:04.247) 0:07:22.737 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Check if the 
TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 January 2025 04:47:30 -0500 (0:00:00.108) 0:07:22.846 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 January 2025 04:47:30 -0500 (0:00:00.055) 0:07:22.901 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 January 2025 04:47:30 -0500 (0:00:00.062) 0:07:22.964 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 January 2025 04:47:31 -0500 (0:00:00.064) 0:07:23.028 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Friday 17 January 2025 04:47:31 -0500 (0:00:00.758) 0:07:23.787 ******** ok: [managed-node1] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status":
"enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", 
"status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": 
"enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", 
"source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-cryptsetup@luks\\x2d79e8fafe\\x2d36db\\x2d4db0\\x2dbb7e\\x2d9669c9da3637.service": { "name": "systemd-cryptsetup@luks\\x2d79e8fafe\\x2d36db\\x2d4db0\\x2dbb7e\\x2d9669c9da3637.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, 
"systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Friday 17 January 2025 04:47:33 -0500 (0:00:01.271) 0:07:25.058 ******** ok: [managed-node1] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d79e8fafe\\x2d36db\\x2d4db0\\x2dbb7e\\x2d9669c9da3637.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Friday 17 January 2025 04:47:33 -0500 (0:00:00.136) 0:07:25.195 ******** changed: [managed-node1] => (item=systemd-cryptsetup@luks\x2d79e8fafe\x2d36db\x2d4db0\x2dbb7e\x2d9669c9da3637.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d79e8fafe\\x2d36db\\x2d4db0\\x2dbb7e\\x2d9669c9da3637.service", "name": "systemd-cryptsetup@luks\\x2d79e8fafe\\x2d36db\\x2d4db0\\x2dbb7e\\x2d9669c9da3637.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "dev-mapper-foo\\x2dtest1.device systemd-journald.socket systemd-readahead-collect.service cryptsetup-pre.target systemd-readahead-replay.service system-systemd\\x2dcryptsetup.slice", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-mapper-foo\\x2dtest1.device", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "Delegate": "no", "Description": "Cryptography Setup for luks-79e8fafe-36db-4db0-bb7e-9669c9da3637", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", 
"ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-79e8fafe-36db-4db0-bb7e-9669c9da3637 /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-79e8fafe-36db-4db0-bb7e-9669c9da3637 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d79e8fafe\\x2d36db\\x2d4db0\\x2dbb7e\\x2d9669c9da3637.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d79e8fafe\\x2d36db\\x2d4db0\\x2dbb7e\\x2d9669c9da3637.service", "IgnoreOnIsolate": "yes", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d79e8fafe\\x2d36db\\x2d4db0\\x2dbb7e\\x2d9669c9da3637.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RequiredBy": "dev-mapper-luks\\x2d79e8fafe\\x2d36db\\x2d4db0\\x2dbb7e\\x2d9669c9da3637.device cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "0", "TimeoutStopUSec": "0", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "bad", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", 
"WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Friday 17 January 2025 04:47:33 -0500 (0:00:00.677) 0:07:25.873 ******** fatal: [managed-node1]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-79e8fafe-36db-4db0-bb7e-9669c9da3637' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:109 Friday 17 January 2025 04:47:38 -0500 (0:00:04.376) 0:07:30.249 ******** fatal: [managed-node1]: FAILED! => { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'diskvolume_mkfs_option_map': {u'ext4': u'-F', u'ext3': u'-F', u'ext2': u'-F'}, u'safe_mode': True, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'encryption_tang_url': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_tang_thumbprint': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'encryption': False, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': 0, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': u'luks1', u'raid_stripe_size': None, u'mount_passno': 0, u'mount_user': None, u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'mount_group': None, u'fs_overwrite_existing': True, u'disks': [u'sda'], u'cached': False, u'thin_pool_size': None, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'shared': False, u'encryption_clevis_pin': None, u'type': u'lvm', u'encryption_cipher': None, u'raid_chunk_size': None}], u'volumes': [], u'pool_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_password': None, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'shared': False, u'type': u'lvm', u'encryption_key_size': None, u'raid_chunk_size': None}, u'volume_defaults': {u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 0, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, 
TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:109 Friday 17 January 2025 04:47:38 -0500 (0:00:04.376) 0:07:30.249 ******** fatal: [managed-node1]: FAILED! => { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'diskvolume_mkfs_option_map': {u'ext4': u'-F', u'ext3': u'-F', u'ext2': u'-F'}, u'safe_mode': True, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'encryption_tang_url': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_tang_thumbprint': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'encryption': False, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': 0, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': u'luks1', u'raid_stripe_size': None, u'mount_passno': 0, u'mount_user': None, u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'mount_group': None, u'fs_overwrite_existing': True, u'disks': [u'sda'], u'cached': False, u'thin_pool_size': None, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'shared': False, u'encryption_clevis_pin': None, u'type': u'lvm', u'encryption_cipher': None, u'raid_chunk_size': None}], u'volumes': [], u'pool_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_password': None, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'shared': False, u'type': u'lvm', u'encryption_key_size': None, u'raid_chunk_size': None}, u'volume_defaults': {u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 0, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'cache_size': 0, u'raid_spare_count': None, u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove existing formatting on device 'luks-79e8fafe-36db-4db0-bb7e-9669c9da3637' in safe mode due to encryption removal"} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Friday 17 January 2025 04:47:38 -0500 (0:00:00.081) 0:07:30.330 ******** changed: [managed-node1] => (item=systemd-cryptsetup@luks\x2d79e8fafe\x2d36db\x2d4db0\x2dbb7e\x2d9669c9da3637.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d79e8fafe\\x2d36db\\x2d4db0\\x2dbb7e\\x2d9669c9da3637.service", "name": "systemd-cryptsetup@luks\\x2d79e8fafe\\x2d36db\\x2d4db0\\x2dbb7e\\x2d9669c9da3637.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d79e8fafe\\x2d36db\\x2d4db0\\x2dbb7e\\x2d9669c9da3637.service", "DevicePolicy": "auto", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d79e8fafe\\x2d36db\\x2d4db0\\x2dbb7e\\x2d9669c9da3637.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "1048576", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "masked", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit":
"18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d79e8fafe\\x2d36db\\x2d4db0\\x2dbb7e\\x2d9669c9da3637.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RequiredBy": "dev-mapper-luks\\x2d79e8fafe\\x2d36db\\x2d4db0\\x2dbb7e\\x2d9669c9da3637.device cryptsetup.target", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", "UnitFileState": "bad", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Friday 17 January 2025 04:47:38 -0500 (0:00:00.613) 0:07:30.944 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Friday 17 January 2025 04:47:38 -0500 (0:00:00.048) 0:07:30.993 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Friday 17 January 2025 04:47:39 -0500 (0:00:00.053) 0:07:31.047 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Friday 17 January 2025 04:47:39 -0500 (0:00:00.039) 0:07:31.087 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737107241.2150102, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1737107241.2150102, "dev": 64769, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 67, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": 
"inode/x-empty", "mode": "0644", "mtime": 1737107241.2150102, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "179567899", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Friday 17 January 2025 04:47:39 -0500 (0:00:00.435) 0:07:31.522 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:446 Friday 17 January 2025 04:47:39 -0500 (0:00:00.114) 0:07:31.637 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 January 2025 04:47:39 -0500 (0:00:00.178) 0:07:31.816 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 January 2025 04:47:39 -0500 (0:00:00.089) 0:07:31.906 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 January 2025 04:47:40 -0500 (0:00:00.144) 0:07:32.050 ******** skipping: [managed-node1] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node1] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node1] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node1] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 January 2025 04:47:40 -0500 (0:00:00.156) 0:07:32.206 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is 
ostree] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 January 2025 04:47:40 -0500 (0:00:00.069) 0:07:32.276 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Friday 17 January 2025 04:47:40 -0500 (0:00:00.058) 0:07:32.334 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 January 2025 04:47:40 -0500 (0:00:00.098) 0:07:32.433 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 January 2025 04:47:40 -0500 (0:00:00.133) 0:07:32.566 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Friday 17 January 2025 04:47:40 -0500 (0:00:00.271) 0:07:32.838 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 January 2025 04:47:44 -0500 (0:00:04.102) 0:07:36.941 ******** ok: [managed-node1] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Friday 17 January 2025 04:47:44 -0500 (0:00:00.070) 0:07:37.012 ******** ok: [managed-node1] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Friday 17 
January 2025 04:47:45 -0500 (0:00:00.074) 0:07:37.086 ******** ok: [managed-node1] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Friday 17 January 2025 04:47:49 -0500 (0:00:04.480) 0:07:41.567 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 January 2025 04:47:49 -0500 (0:00:00.162) 0:07:41.729 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 January 2025 04:47:49 -0500 (0:00:00.113) 0:07:41.843 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 January 2025 04:47:49 -0500 (0:00:00.113) 0:07:41.956 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 January 2025 04:47:49 -0500 (0:00:00.055) 0:07:42.011 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Friday 17 January 2025 04:47:51 -0500 (0:00:01.018) 0:07:43.029 ******** ok: [managed-node1] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": 
"brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": 
"mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": 
"selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-cryptsetup@luks\\x2d79e8fafe\\x2d36db\\x2d4db0\\x2dbb7e\\x2d9669c9da3637.service": { "name": "systemd-cryptsetup@luks\\x2d79e8fafe\\x2d36db\\x2d4db0\\x2dbb7e\\x2d9669c9da3637.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Friday 17 January 2025 04:47:52 -0500 (0:00:01.114) 0:07:44.144 ******** ok: [managed-node1] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d79e8fafe\\x2d36db\\x2d4db0\\x2dbb7e\\x2d9669c9da3637.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Friday 17 January 2025 04:47:52 -0500 (0:00:00.053) 0:07:44.198 ******** changed: [managed-node1] => (item=systemd-cryptsetup@luks\x2d79e8fafe\x2d36db\x2d4db0\x2dbb7e\x2d9669c9da3637.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d79e8fafe\\x2d36db\\x2d4db0\\x2dbb7e\\x2d9669c9da3637.service", "name": "systemd-cryptsetup@luks\\x2d79e8fafe\\x2d36db\\x2d4db0\\x2dbb7e\\x2d9669c9da3637.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "dev-mapper-foo\\x2dtest1.device systemd-journald.socket systemd-readahead-collect.service systemd-readahead-replay.service cryptsetup-pre.target system-systemd\\x2dcryptsetup.slice", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-mapper-foo\\x2dtest1.device", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": 
"18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "Delegate": "no", "Description": "Cryptography Setup for luks-79e8fafe-36db-4db0-bb7e-9669c9da3637", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-79e8fafe-36db-4db0-bb7e-9669c9da3637 /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-79e8fafe-36db-4db0-bb7e-9669c9da3637 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d79e8fafe\\x2d36db\\x2d4db0\\x2dbb7e\\x2d9669c9da3637.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d79e8fafe\\x2d36db\\x2d4db0\\x2dbb7e\\x2d9669c9da3637.service", "IgnoreOnIsolate": "yes", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d79e8fafe\\x2d36db\\x2d4db0\\x2dbb7e\\x2d9669c9da3637.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RequiredBy": "dev-mapper-luks\\x2d79e8fafe\\x2d36db\\x2d4db0\\x2dbb7e\\x2d9669c9da3637.device cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": 
"10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "0", "TimeoutStopUSec": "0", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "bad", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Friday 17 January 2025 04:47:52 -0500 (0:00:00.555) 0:07:44.753 ******** changed: [managed-node1] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-79e8fafe-36db-4db0-bb7e-9669c9da3637", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-79e8fafe-36db-4db0-bb7e-9669c9da3637", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-79e8fafe-36db-4db0-bb7e-9669c9da3637", "password": "-", "state": "absent" } ], "leaves": [ "/dev/mapper/foo-test1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-79e8fafe-36db-4db0-bb7e-9669c9da3637", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "e2fsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], 
"raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Friday 17 January 2025 04:48:57 -0500 (0:01:04.950) 0:08:49.703 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Friday 17 January 2025 04:48:57 -0500 (0:00:00.047) 0:08:49.751 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737107181.0259469, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "ced3592ae7b2c54427438b85f160a5132d55fc74", "ctime": 1737107181.0229468, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263644, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1737107181.0229468, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1311, "uid": 0, "version": "18446744072031193646", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Friday 17 January 2025 04:48:58 -0500 (0:00:00.402) 0:08:50.153 ******** ok: [managed-node1] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Friday 17 January 2025 04:48:58 -0500 (0:00:00.433) 0:08:50.587 ******** changed: [managed-node1] => (item=systemd-cryptsetup@luks\x2d79e8fafe\x2d36db\x2d4db0\x2dbb7e\x2d9669c9da3637.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d79e8fafe\\x2d36db\\x2d4db0\\x2dbb7e\\x2d9669c9da3637.service", "name": "systemd-cryptsetup@luks\\x2d79e8fafe\\x2d36db\\x2d4db0\\x2dbb7e\\x2d9669c9da3637.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": 
"systemd-cryptsetup@luks\\x2d79e8fafe\\x2d36db\\x2d4db0\\x2dbb7e\\x2d9669c9da3637.service", "DevicePolicy": "auto", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d79e8fafe\\x2d36db\\x2d4db0\\x2dbb7e\\x2d9669c9da3637.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "1048576", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "masked", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d79e8fafe\\x2d36db\\x2d4db0\\x2dbb7e\\x2d9669c9da3637.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RequiredBy": "dev-mapper-luks\\x2d79e8fafe\\x2d36db\\x2d4db0\\x2dbb7e\\x2d9669c9da3637.device cryptsetup.target", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", "UnitFileState": "bad", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Friday 17 January 2025 04:48:59 -0500 (0:00:00.650) 0:08:51.238 ******** ok: [managed-node1] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-79e8fafe-36db-4db0-bb7e-9669c9da3637", "fs_type": "xfs" }, { "action": "destroy device", "device": 
"/dev/mapper/luks-79e8fafe-36db-4db0-bb7e-9669c9da3637", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-79e8fafe-36db-4db0-bb7e-9669c9da3637", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/mapper/foo-test1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-79e8fafe-36db-4db0-bb7e-9669c9da3637", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "e2fsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Friday 17 January 2025 04:48:59 -0500 (0:00:00.116) 0:08:51.354 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": 
"/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Friday 17 January 2025 04:48:59 -0500 (0:00:00.112) 0:08:51.467 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Friday 17 January 2025 04:48:59 -0500 (0:00:00.066) 0:08:51.533 ******** changed: [managed-node1] => (item={u'src': u'/dev/mapper/luks-79e8fafe-36db-4db0-bb7e-9669c9da3637', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-79e8fafe-36db-4db0-bb7e-9669c9da3637", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-79e8fafe-36db-4db0-bb7e-9669c9da3637" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Friday 17 January 2025 04:49:00 -0500 (0:00:00.526) 0:08:52.059 ******** ok: [managed-node1] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Friday 17 January 2025 04:49:00 -0500 (0:00:00.576) 0:08:52.636 ******** changed: [managed-node1] => (item={u'src': u'/dev/mapper/foo-test1', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK 
TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Friday 17 January 2025 04:49:01 -0500 (0:00:00.450) 0:08:53.086 ******** skipping: [managed-node1] => (item={u'src': u'/dev/mapper/foo-test1', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Friday 17 January 2025 04:49:01 -0500 (0:00:00.075) 0:08:53.162 ******** ok: [managed-node1] => { "changed": false, "name": null, "status": {} }
TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Friday 17 January 2025 04:49:01 -0500 (0:00:00.532) 0:08:53.694 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737107184.5649507, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "36a7767fecac79f015a3f808ca7d6b6631573205", "ctime": 1737107182.6249485, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263663, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1737107182.6229486, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "18446744072031200077", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }
TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Friday 17 January 2025 04:49:02 -0500 (0:00:00.372) 0:08:54.067 ******** changed: [managed-node1] => (item={u'state': u'absent', u'password': u'-', u'name': u'luks-79e8fafe-36db-4db0-bb7e-9669c9da3637', u'backing_device': u'/dev/mapper/foo-test1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-79e8fafe-36db-4db0-bb7e-9669c9da3637", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed
TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Friday 17 January 2025 04:49:02 -0500 (0:00:00.425) 0:08:54.492 ******** ok: [managed-node1]
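
The crypttab edit above removed the one entry for the retired LUKS device ("found": 1, "1 line(s) removed"). crypttab(5) lines have the form "name backing-device password [options]", matching the name/backing_device/password fields of the loop item. An illustrative standalone removal (the role uses its own crypttab handling internally):

    - name: Drop the stale crypttab entry (sketch)
      lineinfile:
        path: /etc/crypttab
        regexp: '^luks-79e8fafe-36db-4db0-bb7e-9669c9da3637 '
        state: absent
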
TASK [Verify role results] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:462 Friday 17 January 2025 04:49:04 -0500 (0:00:01.790) 0:08:56.283 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node1
TASK [Print out pool information] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Friday 17 January 2025 04:49:04 -0500 (0:00:00.118) 0:08:56.402 ******** ok: [managed-node1] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }
TASK [Print out volume information] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Friday 17 January 2025 04:49:04 -0500 (0:00:00.072) 0:08:56.474 ******** skipping: [managed-node1] => {} TASK [Collect info about the volumes.]
***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Friday 17 January 2025 04:49:04 -0500 (0:00:00.054) 0:08:56.529 ******** ok: [managed-node1] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "b53a758e-eb56-4307-a860-efbfa88f36ec" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "q5PvpB-4kw5-zuiW-X5c3-3C9L-v0UQ-y98OeM" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Friday 17 January 2025 04:49:04 -0500 (0:00:00.403) 0:08:56.932 ******** ok: [managed-node1] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.004016", "end": "2025-01-17 04:49:05.238263", "rc": 0, "start": "2025-01-17 04:49:05.234247" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Thu Jun 20 10:23:46 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs 
ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Friday 17 January 2025 04:49:05 -0500 (0:00:00.412) 0:08:57.345 ******** ok: [managed-node1] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002829", "end": "2025-01-17 04:49:05.630980", "failed_when_result": false, "rc": 0, "start": "2025-01-17 04:49:05.628151" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Friday 17 January 2025 04:49:05 -0500 (0:00:00.416) 0:08:57.761 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node1 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Friday 17 January 2025 04:49:05 -0500 (0:00:00.107) 0:08:57.869 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Friday 17 January 2025 04:49:05 -0500 (0:00:00.038) 0:08:57.907 ******** ok: [managed-node1] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.018810", "end": "2025-01-17 04:49:06.167235", "rc": 0, "start": "2025-01-17 04:49:06.148425" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Friday 17 January 2025 04:49:06 -0500 (0:00:00.345) 0:08:58.253 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Friday 17 January 2025 04:49:06 -0500 (0:00:00.050) 0:08:58.303 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node1 TASK [Set test variables] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Friday 17 January 2025 04:49:06 -0500 (0:00:00.147) 0:08:58.451 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Friday 17 January 2025 04:49:06 -0500 (0:00:00.049) 0:08:58.500 ******** ok: [managed-node1] => (item=/dev/sda) => { 
"ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Friday 17 January 2025 04:49:06 -0500 (0:00:00.366) 0:08:58.867 ******** ok: [managed-node1] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Friday 17 January 2025 04:49:06 -0500 (0:00:00.069) 0:08:58.936 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Friday 17 January 2025 04:49:06 -0500 (0:00:00.069) 0:08:59.006 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Friday 17 January 2025 04:49:07 -0500 (0:00:00.076) 0:08:59.082 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Friday 17 January 2025 04:49:07 -0500 (0:00:00.069) 0:08:59.152 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Friday 17 January 2025 04:49:07 -0500 (0:00:00.070) 0:08:59.223 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:51 Friday 17 January 2025 04:49:07 -0500 (0:00:00.064) 0:08:59.287 ******** ok: [managed-node1] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:64 Friday 17 January 2025 04:49:07 -0500 (0:00:00.087) 0:08:59.375 ******** ok: [managed-node1] => { "changed": false, "rc": 0 } STDOUT: False STDERR: Shared connection to 10.31.46.65 closed. 
TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:73 Friday 17 January 2025 04:49:07 -0500 (0:00:00.298) 0:08:59.674 ******** skipping: [managed-node1] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:83 Friday 17 January 2025 04:49:07 -0500 (0:00:00.067) 0:08:59.741 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node1 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Friday 17 January 2025 04:49:07 -0500 (0:00:00.129) 0:08:59.871 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Friday 17 January 2025 04:49:07 -0500 (0:00:00.052) 0:08:59.923 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Friday 17 January 2025 04:49:07 -0500 (0:00:00.044) 0:08:59.968 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Friday 17 January 2025 04:49:07 -0500 (0:00:00.046) 0:09:00.014 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Friday 17 January 2025 04:49:08 -0500 (0:00:00.050) 0:09:00.065 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Friday 17 January 2025 04:49:08 -0500 (0:00:00.051) 0:09:00.116 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Friday 17 January 2025 04:49:08 -0500 (0:00:00.039) 0:09:00.155 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Friday 17 January 2025 04:49:08 -0500 (0:00:00.038) 0:09:00.194 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Friday 17 January 2025 04:49:08 -0500 (0:00:00.038) 0:09:00.233 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Friday 17 January 2025 04:49:08 -0500 (0:00:00.038) 0:09:00.272 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Friday 17 January 2025 04:49:08 -0500 (0:00:00.037) 0:09:00.309 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:86 Friday 17 January 2025 04:49:08 -0500 (0:00:00.041) 0:09:00.351 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node1 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Friday 17 January 2025 04:49:08 -0500 (0:00:00.079) 0:09:00.430 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node1 TASK [Get information about the LV] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Friday 17 January 2025 04:49:08 -0500 (0:00:00.095) 0:09:00.526 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Friday 17 January 2025 04:49:08 -0500 (0:00:00.068) 0:09:00.595 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Friday 17 January 2025 04:49:08 -0500 (0:00:00.058) 0:09:00.653 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** 
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Friday 17 January 2025 04:49:08 -0500 (0:00:00.047) 0:09:00.701 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Friday 17 January 2025 04:49:08 -0500 (0:00:00.059) 0:09:00.761 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Friday 17 January 2025 04:49:08 -0500 (0:00:00.060) 0:09:00.821 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Friday 17 January 2025 04:49:08 -0500 (0:00:00.058) 0:09:00.880 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:89 Friday 17 January 2025 04:49:08 -0500 (0:00:00.049) 0:09:00.929 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node1 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Friday 17 January 2025 04:49:08 -0500 (0:00:00.086) 0:09:01.015 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node1 TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Friday 17 January 2025 04:49:09 -0500 (0:00:00.082) 0:09:01.097 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Friday 17 January 2025 04:49:09 -0500 (0:00:00.038) 0:09:01.135 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Friday 17 January 2025 04:49:09 -0500 (0:00:00.037) 0:09:01.173 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Friday 17 January 2025 04:49:09 -0500 (0:00:00.038) 0:09:01.211 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:92 Friday 17 January 2025 04:49:09 -0500 (0:00:00.037) 0:09:01.249 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node1 TASK [Set test variables] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Friday 17 January 2025 04:49:09 -0500 (0:00:00.087) 0:09:01.337 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Friday 17 January 2025 04:49:09 -0500 (0:00:00.051) 0:09:01.388 ******** skipping: [managed-node1] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Friday 17 January 2025 04:49:09 -0500 (0:00:00.061) 0:09:01.450 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node1 TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Friday 17 January 2025 04:49:09 -0500 (0:00:00.113) 0:09:01.563 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Friday 17 January 2025 04:49:09 -0500 (0:00:00.069) 0:09:01.633 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Friday 17 January 2025 04:49:09 -0500 (0:00:00.070) 0:09:01.704 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Friday 17 January 2025 04:49:09 -0500 (0:00:00.055) 0:09:01.759 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result 
was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Friday 17 January 2025 04:49:09 -0500 (0:00:00.060) 0:09:01.819 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Friday 17 January 2025 04:49:09 -0500 (0:00:00.063) 0:09:01.882 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Friday 17 January 2025 04:49:09 -0500 (0:00:00.074) 0:09:01.957 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:95 Friday 17 January 2025 04:49:10 -0500 (0:00:00.150) 0:09:02.108 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node1 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Friday 17 January 2025 04:49:10 -0500 (0:00:00.131) 0:09:02.239 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node1 TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Friday 17 January 2025 04:49:10 -0500 (0:00:00.120) 0:09:02.360 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Friday 17 January 2025 04:49:10 -0500 (0:00:00.058) 0:09:02.418 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Friday 17 January 2025 04:49:10 -0500 (0:00:00.057) 0:09:02.476 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Friday 17 January 2025 04:49:10 -0500 (0:00:00.087) 0:09:02.563 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO 
deduplication is off] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Friday 17 January 2025 04:49:10 -0500 (0:00:00.071) 0:09:02.634 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Friday 17 January 2025 04:49:10 -0500 (0:00:00.067) 0:09:02.702 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Clear test variables] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Friday 17 January 2025 04:49:10 -0500 (0:00:00.064) 0:09:02.767 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }
TASK [Check Stratis] *********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:98 Friday 17 January 2025 04:49:10 -0500 (0:00:00.059) 0:09:02.826 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node1
TASK [Run 'stratis report'] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Friday 17 January 2025 04:49:10 -0500 (0:00:00.166) 0:09:02.993 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:11 Friday 17 January 2025 04:49:11 -0500 (0:00:00.107) 0:09:03.100 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify that the pools were created] ************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Friday 17 January 2025 04:49:11 -0500 (0:00:00.073) 0:09:03.174 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:25 Friday 17 January 2025 04:49:11 -0500 (0:00:00.109) 0:09:03.284 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:34 Friday 17 January 2025 04:49:11 -0500 (0:00:00.103) 0:09:03.387 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Reset variable used by test] ********************************************* task path:
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:44 Friday 17 January 2025 04:49:11 -0500 (0:00:00.087) 0:09:03.475 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:101 Friday 17 January 2025 04:49:11 -0500 (0:00:00.057) 0:09:03.532 ******** ok: [managed-node1] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Friday 17 January 2025 04:49:11 -0500 (0:00:00.052) 0:09:03.585 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node1 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Friday 17 January 2025 04:49:11 -0500 (0:00:00.086) 0:09:03.672 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Friday 17 January 2025 04:49:11 -0500 (0:00:00.045) 0:09:03.718 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node1 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Friday 17 January 2025 04:49:11 -0500 (0:00:00.260) 0:09:03.979 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Friday 17 January 2025 04:49:12 -0500 (0:00:00.069) 0:09:04.049 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Friday 17 January 2025 04:49:12 -0500 (0:00:00.071) 0:09:04.120 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Friday 17 January 2025 04:49:12 -0500 (0:00:00.059) 0:09:04.179 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Friday 17 January 2025 04:49:12 -0500 (0:00:00.073) 0:09:04.252 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Friday 17 January 2025 04:49:12 -0500 (0:00:00.059) 0:09:04.312 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Friday 17 January 2025 04:49:12 -0500 (0:00:00.060) 0:09:04.373 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Friday 17 January 2025 04:49:12 -0500 (0:00:00.063) 0:09:04.437 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Friday 17 January 2025 04:49:12 -0500 (0:00:00.070) 0:09:04.507 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Friday 17 January 2025 04:49:12 -0500 (0:00:00.059) 0:09:04.567 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Friday 17 January 2025 
04:49:12 -0500 (0:00:00.078) 0:09:04.645 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }
TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Friday 17 January 2025 04:49:12 -0500 (0:00:00.056) 0:09:04.702 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }
TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Friday 17 January 2025 04:49:12 -0500 (0:00:00.094) 0:09:04.797 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed
TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Friday 17 January 2025 04:49:12 -0500 (0:00:00.102) 0:09:04.899 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed
TASK [Verify mount_options] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Friday 17 January 2025 04:49:12 -0500 (0:00:00.056) 0:09:04.956 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Friday 17 January 2025 04:49:12 -0500 (0:00:00.055) 0:09:05.011 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed
TASK [Clean up variables] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Friday 17 January 2025 04:49:13 -0500 (0:00:00.072) 0:09:05.084 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }
TASK [Verify fs type] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Friday 17 January 2025 04:49:13 -0500 (0:00:00.061) 0:09:05.145 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed
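
The fstab verification above hinges on match counts: exactly one line whose source is /dev/mapper/foo-test1 and exactly one that mounts /opt/test1, which the "cat /etc/fstab" output earlier confirms. In outline, with the variables from the log (the real assertions live in test-verify-volume-fstab.yml):

    - name: Verify fstab entry counts (sketch)
      assert:
        that:
          - storage_test_fstab_id_matches | length == 1
          - storage_test_fstab_mount_point_matches | length == 1
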
TASK [Verify fs label] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Friday 17 January 2025 04:49:13 -0500 (0:00:00.082) 0:09:05.227 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed
TASK [See whether the device node is present] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Friday 17 January 2025 04:49:13 -0500 (0:00:00.099) 0:09:05.327 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737107337.5621119, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1737107337.5621119, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 428793, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1737107337.5621119, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }
TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Friday 17 January 2025 04:49:13 -0500 (0:00:00.399) 0:09:05.727 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed
TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Friday 17 January 2025 04:49:13 -0500 (0:00:00.070) 0:09:05.797 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Friday 17 January 2025 04:49:13 -0500 (0:00:00.086) 0:09:05.884 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed
TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Friday 17 January 2025 04:49:13 -0500 (0:00:00.106) 0:09:05.991 ******** ok: [managed-node1] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }
TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Friday 17 January 2025 04:49:14 -0500 (0:00:00.080) 0:09:06.072 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Friday 17 January 2025 04:49:14 -0500 (0:00:00.057) 0:09:06.130 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed
TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Friday 17 January 2025 04:49:14 -0500 (0:00:00.075) 0:09:06.205 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Friday 17 January 2025 04:49:14 -0500 (0:00:00.071) 0:09:06.276 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] }
TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Friday 17 January 2025 04:49:15 -0500 (0:00:00.757) 0:09:07.034 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Friday 17 January 2025 04:49:15 -0500 (0:00:00.076) 0:09:07.110 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Friday 17 January 2025 04:49:15 -0500 (0:00:00.062) 0:09:07.173 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed
TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Friday 17 January 2025 04:49:15 -0500 (0:00:00.068) 0:09:07.242 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Friday 17 January 2025 04:49:15 -0500 (0:00:00.047) 0:09:07.290 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Friday 17 January 2025 04:49:15 -0500 (0:00:00.045) 0:09:07.335 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Friday 17 January 2025 04:49:15 -0500 (0:00:00.055) 0:09:07.391 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
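
Every LUKS-specific check in this block is skipped because the volume is no longer encrypted; cryptsetup stays installed so the checks could run if it were. In the encrypted case, collecting LUKS details typically comes down to dumping the header of the backing device, roughly (illustrative only; in this run /dev/mapper/foo-test1 carries no LUKS header):

    - name: Inspect a LUKS header (sketch)
      command: cryptsetup luksDump /dev/mapper/foo-test1
      register: luks_dump
      changed_when: false
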
TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Friday 17 January 2025 04:49:15 -0500 (0:00:00.041) 0:09:07.432 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Set test variables] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Friday 17 January 2025 04:49:15 -0500 (0:00:00.038) 0:09:07.471 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }
TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Friday 17 January 2025 04:49:15 -0500 (0:00:00.047) 0:09:07.519 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed
TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Friday 17 January 2025 04:49:15 -0500 (0:00:00.044) 0:09:07.563 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Friday 17 January 2025 04:49:15 -0500 (0:00:00.035) 0:09:07.598 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Friday 17 January 2025 04:49:15 -0500 (0:00:00.035) 0:09:07.634 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Clear test variables] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Friday 17 January 2025 04:49:15 -0500 (0:00:00.039) 0:09:07.673 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }
TASK [Get information about RAID] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Friday 17 January 2025 04:49:15 -0500 (0:00:00.036) 0:09:07.710 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Set active devices regex] ************************************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Friday 17 January 2025 04:49:15 -0500 (0:00:00.035) 0:09:07.746 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Friday 17 January 2025 04:49:15 -0500 (0:00:00.042) 0:09:07.789 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Friday 17 January 2025 04:49:15 -0500 (0:00:00.049) 0:09:07.838 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Friday 17 January 2025 04:49:15 -0500 (0:00:00.057) 0:09:07.896 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Friday 17 January 2025 04:49:15 -0500 (0:00:00.086) 0:09:07.982 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Friday 17 January 2025 04:49:16 -0500 (0:00:00.094) 0:09:08.077 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Friday 17 January 2025 04:49:16 -0500 (0:00:00.058) 0:09:08.135 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Friday 17 January 2025 04:49:16 -0500 (0:00:00.117) 0:09:08.252 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Friday 17 January 2025 04:49:16 -0500 (0:00:00.059) 0:09:08.312 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Friday 17 January 2025 04:49:16 -0500 (0:00:00.098) 0:09:08.411 ******** ok: [managed-node1] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" }
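The size parse above turns the volume's nominal size into raw bytes (4294967296) plus lvm- and parted-style renderings. The same conversion is available in a playbook through Ansible's built-in human_to_bytes filter, which, as far as I can tell, treats a bare "g" suffix as binary gibibytes, so a quick sanity check along these lines should pass:

  - name: Confirm that "4g" resolves to the expected byte count (illustrative)
    ansible.builtin.assert:
      that:
        # human_to_bytes("4g") should equal 4 * 1024**3, assuming binary-unit parsing
        - ("4g" | human_to_bytes) == 4294967296

TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Friday 17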
January 2025 04:49:16 -0500 (0:00:00.425) 0:09:08.864 ******** ok: [managed-node1] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Friday 17 January 2025 04:49:17 -0500 (0:00:00.425) 0:09:09.290 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Friday 17 January 2025 04:49:17 -0500 (0:00:00.066) 0:09:09.356 ******** ok: [managed-node1] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Friday 17 January 2025 04:49:17 -0500 (0:00:00.074) 0:09:09.431 ******** ok: [managed-node1] => { "bytes": 10737418240, "changed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Friday 17 January 2025 04:49:17 -0500 (0:00:00.511) 0:09:09.942 ******** skipping: [managed-node1] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Friday 17 January 2025 04:49:18 -0500 (0:00:00.117) 0:09:10.059 ******** skipping: [managed-node1] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Friday 17 January 2025 04:49:18 -0500 (0:00:00.089) 0:09:10.149 ******** skipping: [managed-node1] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Friday 17 January 2025 04:49:18 -0500 (0:00:00.066) 0:09:10.216 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Friday 17 January 2025 04:49:18 -0500 (0:00:00.074) 0:09:10.291 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Friday 17 January 2025 04:49:18 -0500 (0:00:00.067) 0:09:10.358 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
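Because the requested size here is the literal "4g" rather than a percentage, all the percentage and thin-pool calculation tasks that follow skip, and the expected size stays the 4294967296 established above. For a volume requested as, say, "60%", the skipped branch would instead derive the expectation from the 10 GiB pool; a rough sketch of that arithmetic, with pool_size and volume_size as illustrative stand-in names:

  - name: Calculate the expected size based on pool size and percentage value (illustrative)
    ansible.builtin.set_fact:
      # e.g. 10737418240 * (60 / 100) -> 6442450944 expected bytes
      storage_test_expected_size: "{{ (pool_size.bytes * ((volume_size | replace('%', '') | int) / 100)) | int }}"

TASK [Default maximal thin pool reserved space size] *************************** task path: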
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Friday 17 January 2025 04:49:18 -0500 (0:00:00.059) 0:09:10.418 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Friday 17 January 2025 04:49:18 -0500 (0:00:00.062) 0:09:10.480 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Friday 17 January 2025 04:49:18 -0500 (0:00:00.095) 0:09:10.575 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Friday 17 January 2025 04:49:18 -0500 (0:00:00.078) 0:09:10.654 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Friday 17 January 2025 04:49:18 -0500 (0:00:00.066) 0:09:10.720 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Friday 17 January 2025 04:49:18 -0500 (0:00:00.058) 0:09:10.778 ******** skipping: [managed-node1] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Friday 17 January 2025 04:49:18 -0500 (0:00:00.074) 0:09:10.853 ******** skipping: [managed-node1] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Friday 17 January 2025 04:49:18 -0500 (0:00:00.077) 0:09:10.930 ******** skipping: [managed-node1] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Friday 17 January 2025 04:49:18 -0500 (0:00:00.075) 0:09:11.005 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Friday 17 January 2025 04:49:19 -0500 (0:00:00.065) 0:09:11.071 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Friday 17 January 2025 04:49:19 -0500 (0:00:00.066) 0:09:11.138 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Friday 17 January 2025 04:49:19 -0500 (0:00:00.058) 0:09:11.196 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Friday 17 January 2025 04:49:19 -0500 (0:00:00.056) 0:09:11.253 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Friday 17 January 2025 04:49:19 -0500 (0:00:00.061) 0:09:11.315 ******** ok: [managed-node1] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Friday 17 January 2025 04:49:19 -0500 (0:00:00.063) 0:09:11.378 ******** ok: [managed-node1] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Friday 17 January 2025 04:49:19 -0500 (0:00:00.061) 0:09:11.440 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Friday 17 January 2025 04:49:19 -0500 (0:00:00.080) 0:09:11.520 ******** ok: [managed-node1] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.020022", "end": "2025-01-17 04:49:19.939900", "rc": 0, "start": "2025-01-17 04:49:19.919878" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Friday 17 January 2025 04:49:20 -0500 (0:00:00.533) 0:09:12.054 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false }
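Because lvs was invoked with --nameprefixes and --unquoted, its stdout is a flat run of KEY=value pairs, which is why a single regex suffices to lift out the segment type ("linear" here, since the LV is neither cached nor thin). A sketch of that extraction, with lvs_info standing in for whatever name the command result was registered under:

  - name: Set LV segment type from lvs output (illustrative sketch)
    ansible.builtin.set_fact:
      # regex_findall returns a list of captures, matching the ["linear"] fact shown above
      storage_test_lv_segtype: "{{ lvs_info.stdout | regex_findall('LVM2_SEGTYPE=(\\S+)') }}"

TASK [Check segment type] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Friday 17 January 2025 04:49:20 -0500 (0:00:00.075) 0:09:12.129 ******** ok: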
[managed-node1] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Friday 17 January 2025 04:49:20 -0500 (0:00:00.082) 0:09:12.212 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Friday 17 January 2025 04:49:20 -0500 (0:00:00.078) 0:09:12.291 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Friday 17 January 2025 04:49:20 -0500 (0:00:00.082) 0:09:12.374 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Friday 17 January 2025 04:49:20 -0500 (0:00:00.069) 0:09:12.443 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Friday 17 January 2025 04:49:20 -0500 (0:00:00.073) 0:09:12.516 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Friday 17 January 2025 04:49:20 -0500 (0:00:00.146) 0:09:12.663 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Friday 17 January 2025 04:49:20 -0500 (0:00:00.051) 0:09:12.714 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Friday 17 January 2025 04:49:20 -0500 (0:00:00.097) 0:09:12.811 ******** changed: [managed-node1] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:468 Friday 17 January 2025 04:49:21 -0500 (0:00:00.461) 0:09:13.273 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node1
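The safe_mode scenario that starts here follows a fixed pattern: write a marker file (/opt/test1/quux) onto the mounted volume, rerun the role with a spec that could only be satisfied by reformatting the device, require that the role fails, and later let verify-data-preservation.yml confirm the marker file survived. The failure expectation in verify-role-failed.yml is the kind of check usually expressed with a block/rescue arrangement; a hedged sketch, with blivet_output standing in for the registered role result:

  - name: Verify role raises correct error (illustrative sketch)
    block:
      - name: Run the storage role with a spec that would reformat test1
        ansible.builtin.include_role:
          name: fedora.linux_system_roles.storage
      - name: Fail the test if the role succeeded
        ansible.builtin.fail:
          msg: role did not raise the expected error
    rescue:
      - name: Check the error message reported by blivet
        ansible.builtin.assert:
          that:
            # blivet_output is a hypothetical name for the registered failure result
            - "'cannot remove existing formatting' in blivet_output.msg"

TASK [Store global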
variable value copy] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Friday 17 January 2025 04:49:21 -0500 (0:00:00.084) 0:09:13.357 ******** ok: [managed-node1] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Friday 17 January 2025 04:49:21 -0500 (0:00:00.075) 0:09:13.433 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 January 2025 04:49:21 -0500 (0:00:00.108) 0:09:13.542 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 January 2025 04:49:21 -0500 (0:00:00.108) 0:09:13.650 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 January 2025 04:49:21 -0500 (0:00:00.097) 0:09:13.748 ******** skipping: [managed-node1] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node1] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node1] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node1] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 January 2025 04:49:21 -0500 (0:00:00.189) 0:09:13.937 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 January 2025 04:49:21 -0500 (0:00:00.065) 0:09:14.003 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Friday 17 January 2025 04:49:22 -0500 (0:00:00.065) 0:09:14.069 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 January 2025 04:49:22 -0500 (0:00:00.083) 0:09:14.153 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 January 2025 04:49:22 -0500 (0:00:00.060) 0:09:14.213 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Friday 17 January 2025 04:49:22 -0500 (0:00:00.277) 0:09:14.491 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 January 2025 04:49:26 -0500 (0:00:04.050) 0:09:18.541 ******** ok: [managed-node1] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Friday 17 January 2025 04:49:26 -0500 (0:00:00.052) 0:09:18.593 ******** ok: [managed-node1] => { "storage_volumes": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Friday 17 January 2025 04:49:26 -0500 (0:00:00.047) 0:09:18.641 ******** ok: [managed-node1] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] }
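The storage_pools structure echoed above is the interesting part of this failure case: the pool and LV already exist unencrypted, and the caller now asks for encryption: true on test1. Expressed as play variables, the request looks roughly like this (the password is the test suite's own literal, and storage_safe_mode: true mirrors the safe_mode: True visible later in the module args):

  vars:
    storage_safe_mode: true
    storage_pools:
      - name: foo
        type: lvm
        disks:
          - sda
        volumes:
          - name: test1
            size: 4g
            mount_point: /opt/test1
            encryption: true
            encryption_password: yabbadabbadoo

With safe mode in effect, the role refuses any action that would destroy existing formatting, and layering LUKS onto the existing filesystem on test1 would do exactly that, hence the fatal error further down; setting storage_safe_mode: false is the explicit opt-out a caller uses when such reformatting is actually intended.

TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: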
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Friday 17 January 2025 04:49:30 -0500 (0:00:04.129) 0:09:22.770 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 January 2025 04:49:30 -0500 (0:00:00.104) 0:09:22.874 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 January 2025 04:49:30 -0500 (0:00:00.050) 0:09:22.925 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 January 2025 04:49:30 -0500 (0:00:00.044) 0:09:22.970 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 January 2025 04:49:30 -0500 (0:00:00.034) 0:09:23.004 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Friday 17 January 2025 04:49:31 -0500 (0:00:00.799) 0:09:23.804 ******** ok: [managed-node1] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { 
"name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": 
"dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": 
"mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": 
"unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-cryptsetup@luks\\x2d79e8fafe\\x2d36db\\x2d4db0\\x2dbb7e\\x2d9669c9da3637.service": { "name": "systemd-cryptsetup@luks\\x2d79e8fafe\\x2d36db\\x2d4db0\\x2dbb7e\\x2d9669c9da3637.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": 
"running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Friday 17 January 2025 04:49:32 -0500 (0:00:00.988) 0:09:24.792 ******** ok: [managed-node1] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d79e8fafe\\x2d36db\\x2d4db0\\x2dbb7e\\x2d9669c9da3637.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Friday 17 January 2025 04:49:32 -0500 (0:00:00.081) 0:09:24.873 ******** changed: [managed-node1] => (item=systemd-cryptsetup@luks\x2d79e8fafe\x2d36db\x2d4db0\x2dbb7e\x2d9669c9da3637.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d79e8fafe\\x2d36db\\x2d4db0\\x2dbb7e\\x2d9669c9da3637.service", "name": "systemd-cryptsetup@luks\\x2d79e8fafe\\x2d36db\\x2d4db0\\x2dbb7e\\x2d9669c9da3637.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-readahead-collect.service dev-mapper-foo\\x2dtest1.device system-systemd\\x2dcryptsetup.slice systemd-readahead-replay.service systemd-journald.socket cryptsetup-pre.target", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-mapper-foo\\x2dtest1.device", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "Conflicts": "umount.target", "ControlPID": "0", 
"DefaultDependencies": "no", "Delegate": "no", "Description": "Cryptography Setup for luks-79e8fafe-36db-4db0-bb7e-9669c9da3637", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-79e8fafe-36db-4db0-bb7e-9669c9da3637 /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-79e8fafe-36db-4db0-bb7e-9669c9da3637 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d79e8fafe\\x2d36db\\x2d4db0\\x2dbb7e\\x2d9669c9da3637.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d79e8fafe\\x2d36db\\x2d4db0\\x2dbb7e\\x2d9669c9da3637.service", "IgnoreOnIsolate": "yes", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d79e8fafe\\x2d36db\\x2d4db0\\x2dbb7e\\x2d9669c9da3637.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": 
"18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "0", "TimeoutStopUSec": "0", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "bad", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Friday 17 January 2025 04:49:33 -0500 (0:00:00.662) 0:09:25.536 ******** fatal: [managed-node1]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'test1' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:109 Friday 17 January 2025 04:49:37 -0500 (0:00:04.381) 0:09:29.918 ******** fatal: [managed-node1]: FAILED! => { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'diskvolume_mkfs_option_map': {u'ext4': u'-F', u'ext3': u'-F', u'ext2': u'-F'}, u'safe_mode': True, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'encryption_tang_url': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_tang_thumbprint': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'encryption': True, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'mount_passno': 0, u'mount_user': None, u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'mount_group': None, u'fs_overwrite_existing': True, u'disks': [u'sda'], u'cached': False, u'thin_pool_size': None, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'shared': False, u'encryption_clevis_pin': None, u'type': u'lvm', u'encryption_cipher': None, u'raid_chunk_size': None}], u'volumes': [], u'pool_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_password': None, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'shared': False, u'type': u'lvm', u'encryption_key_size': None, u'raid_chunk_size': None}, 
u'volume_defaults': {u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 0, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'cache_size': 0, u'raid_spare_count': None, u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove existing formatting on device 'test1' in safe mode due to adding encryption"} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Friday 17 January 2025 04:49:37 -0500 (0:00:00.078) 0:09:29.997 ******** changed: [managed-node1] => (item=systemd-cryptsetup@luks\x2d79e8fafe\x2d36db\x2d4db0\x2dbb7e\x2d9669c9da3637.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d79e8fafe\\x2d36db\\x2d4db0\\x2dbb7e\\x2d9669c9da3637.service", "name": "systemd-cryptsetup@luks\\x2d79e8fafe\\x2d36db\\x2d4db0\\x2dbb7e\\x2d9669c9da3637.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d79e8fafe\\x2d36db\\x2d4db0\\x2dbb7e\\x2d9669c9da3637.service", "DevicePolicy": "auto", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d79e8fafe\\x2d36db\\x2d4db0\\x2dbb7e\\x2d9669c9da3637.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "1048576", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", 
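[Editor's note] The fatal result above is the role's safe mode doing its job: the module was invoked with safe_mode=True (visible in the module args), so blivet refuses to destroy the existing xfs formatting on 'test1', which would be required to reformat the LV as a LUKS container. Note also that the 'Unmask the systemd cryptsetup services' task still runs after the failure, restoring the systemd-cryptsetup@ unit the role had masked before touching the devices. A minimal sketch of how a playbook would deliberately opt out of this protection, assuming the role's documented storage_safe_mode variable (the switch that feeds the module's safe_mode argument):

    - hosts: managed-node1
      vars:
        # Assumption: storage_safe_mode defaults to true; setting it to false
        # permits destructive operations such as replacing an existing
        # filesystem with a LUKS container.
        storage_safe_mode: false
      roles:
        - fedora.linux_system_roles.storage

The next invocation of the role in this log succeeds in replacing the xfs formatting with LUKS, consistent with safe mode having been relaxed for that run.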
"LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "masked", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d79e8fafe\\x2d36db\\x2d4db0\\x2dbb7e\\x2d9669c9da3637.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", "UnitFileState": "bad", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Friday 17 January 2025 04:49:38 -0500 (0:00:00.676) 0:09:30.673 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Friday 17 January 2025 04:49:38 -0500 (0:00:00.080) 0:09:30.753 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Friday 17 January 2025 04:49:38 -0500 (0:00:00.123) 0:09:30.877 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Friday 17 January 2025 04:49:38 -0500 (0:00:00.095) 0:09:30.972 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737107361.1871367, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1737107361.1871367, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 67, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, 
"islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1737107361.1871367, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "18446744072273819642", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Friday 17 January 2025 04:49:39 -0500 (0:00:00.663) 0:09:31.635 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Add encryption to the volume] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:491 Friday 17 January 2025 04:49:39 -0500 (0:00:00.119) 0:09:31.755 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 17 January 2025 04:49:40 -0500 (0:00:00.420) 0:09:32.176 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 17 January 2025 04:49:40 -0500 (0:00:00.117) 0:09:32.293 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 17 January 2025 04:49:40 -0500 (0:00:00.072) 0:09:32.365 ******** skipping: [managed-node1] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node1] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node1] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node1] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 17 January 2025 04:49:40 -0500 (0:00:00.130) 0:09:32.496 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 17 January 2025 04:49:40 -0500 (0:00:00.088) 0:09:32.584 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Friday 17 January 2025 04:49:40 -0500 (0:00:00.086) 0:09:32.671 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 17 January 2025 04:49:40 -0500 (0:00:00.089) 0:09:32.760 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 17 January 2025 04:49:40 -0500 (0:00:00.139) 0:09:32.900 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Friday 17 January 2025 04:49:41 -0500 (0:00:00.177) 0:09:33.077 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 17 January 2025 04:49:45 -0500 (0:00:04.137) 0:09:37.214 ******** ok: [managed-node1] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Friday 17 January 2025 04:49:45 -0500 (0:00:00.170) 0:09:37.385 ******** ok: [managed-node1] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Friday 17 January 2025 04:49:45 -0500 (0:00:00.063) 0:09:37.459 ******** ok: [managed-node1] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Friday 17 January 2025 04:49:49 -0500 (0:00:04.234) 0:09:41.694 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 January 2025 04:49:49 -0500 (0:00:00.103) 0:09:41.797 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 January 2025 04:49:49 -0500 (0:00:00.051) 0:09:41.849 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 January 2025 04:49:49 -0500 (0:00:00.055) 0:09:41.904 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 January 2025 04:49:49 -0500 (0:00:00.059) 0:09:41.963 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Friday 17 January 2025 04:49:50 -0500 (0:00:00.760) 0:09:42.724 ******** ok: [managed-node1] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, 
"mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", 
"state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { 
"name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": 
"systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Friday 17 January 2025 04:49:51 -0500 (0:00:01.093) 0:09:43.817 ******** ok: [managed-node1] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Friday 17 January 2025 04:49:51 -0500 (0:00:00.093) 0:09:43.911 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Friday 17 January 2025 04:49:51 -0500 (0:00:00.078) 0:09:43.990 ******** changed: [managed-node1] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-be6ebeb9-2a48-4dca-b637-85f4c4e7e728", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-be6ebeb9-2a48-4dca-b637-85f4c4e7e728", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-be6ebeb9-2a48-4dca-b637-85f4c4e7e728", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-be6ebeb9-2a48-4dca-b637-85f4c4e7e728" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, 
"path": "/opt/test1", "src": "/dev/mapper/luks-be6ebeb9-2a48-4dca-b637-85f4c4e7e728", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-be6ebeb9-2a48-4dca-b637-85f4c4e7e728", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-be6ebeb9-2a48-4dca-b637-85f4c4e7e728", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Friday 17 January 2025 04:50:02 -0500 (0:00:10.827) 0:09:54.817 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Friday 17 January 2025 04:50:02 -0500 (0:00:00.064) 0:09:54.882 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737107340.9461155, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "3fceedeef6c619b69ada96279531b69ed89734ba", "ctime": 1737107340.9441154, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263644, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1737107340.9441154, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1279, "uid": 0, "version": "18446744072031193646", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Friday 17 January 2025 04:50:03 -0500 (0:00:00.468) 0:09:55.351 ******** ok: [managed-node1] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Friday 17 January 2025 04:50:03 -0500 (0:00:00.419) 0:09:55.770 ******** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Friday 17 January 2025 04:50:03 -0500 (0:00:00.033) 0:09:55.804 ******** ok: [managed-node1] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-be6ebeb9-2a48-4dca-b637-85f4c4e7e728", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-be6ebeb9-2a48-4dca-b637-85f4c4e7e728", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-be6ebeb9-2a48-4dca-b637-85f4c4e7e728", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-be6ebeb9-2a48-4dca-b637-85f4c4e7e728" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-be6ebeb9-2a48-4dca-b637-85f4c4e7e728", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-be6ebeb9-2a48-4dca-b637-85f4c4e7e728", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-be6ebeb9-2a48-4dca-b637-85f4c4e7e728", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, 
"raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Friday 17 January 2025 04:50:03 -0500 (0:00:00.052) 0:09:55.856 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-be6ebeb9-2a48-4dca-b637-85f4c4e7e728", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-be6ebeb9-2a48-4dca-b637-85f4c4e7e728", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Friday 17 January 2025 04:50:03 -0500 (0:00:00.047) 0:09:55.904 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Friday 17 January 2025 04:50:03 -0500 (0:00:00.040) 0:09:55.944 ******** changed: [managed-node1] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Friday 17 January 2025 04:50:04 -0500 (0:00:00.455) 0:09:56.400 ******** ok: [managed-node1] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Friday 17 January 2025 04:50:04 -0500 (0:00:00.536) 0:09:56.937 ******** changed: [managed-node1] => (item={u'src': u'/dev/mapper/luks-be6ebeb9-2a48-4dca-b637-85f4c4e7e728', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-be6ebeb9-2a48-4dca-b637-85f4c4e7e728", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-be6ebeb9-2a48-4dca-b637-85f4c4e7e728" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Friday 17 January 2025 04:50:05 -0500 (0:00:00.491) 0:09:57.428 ******** skipping: [managed-node1] => (item={u'src': u'/dev/mapper/luks-be6ebeb9-2a48-4dca-b637-85f4c4e7e728', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-be6ebeb9-2a48-4dca-b637-85f4c4e7e728", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Friday 17 January 2025 04:50:05 -0500 (0:00:00.079) 0:09:57.508 ******** ok: [managed-node1] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Friday 17 January 2025 04:50:06 -0500 (0:00:00.603) 0:09:58.111 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737107345.6301203, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1737107342.370117, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263661, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1737107342.370117, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "18446744072031200442", "wgrp": false, "woth": 
false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Friday 17 January 2025 04:50:06 -0500 (0:00:00.438) 0:09:58.550 ******** changed: [managed-node1] => (item={u'state': u'present', u'password': u'-', u'name': u'luks-be6ebeb9-2a48-4dca-b637-85f4c4e7e728', u'backing_device': u'/dev/mapper/foo-test1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-be6ebeb9-2a48-4dca-b637-85f4c4e7e728", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Friday 17 January 2025 04:50:06 -0500 (0:00:00.383) 0:09:58.934 ******** ok: [managed-node1] TASK [Verify role results] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:507 Friday 17 January 2025 04:50:07 -0500 (0:00:00.906) 0:09:59.840 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node1 TASK [Print out pool information] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Friday 17 January 2025 04:50:07 -0500 (0:00:00.163) 0:10:00.004 ******** ok: [managed-node1] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-be6ebeb9-2a48-4dca-b637-85f4c4e7e728", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-be6ebeb9-2a48-4dca-b637-85f4c4e7e728", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] 
TASK [fedora.linux_system_roles.storage : Update facts] ************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222
Friday 17 January 2025 04:50:06 -0500 (0:00:00.383) 0:09:58.934 ********
ok: [managed-node1]

TASK [Verify role results] *****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:507
Friday 17 January 2025 04:50:07 -0500 (0:00:00.906) 0:09:59.840 ********
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node1

TASK [Print out pool information] **********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2
Friday 17 January 2025 04:50:07 -0500 (0:00:00.163) 0:10:00.004 ********
ok: [managed-node1] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-be6ebeb9-2a48-4dca-b637-85f4c4e7e728", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-be6ebeb9-2a48-4dca-b637-85f4c4e7e728", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }

TASK [Print out volume information] ********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7
Friday 17 January 2025 04:50:08 -0500 (0:00:00.192) 0:10:00.196 ********
skipping: [managed-node1] => {}

TASK [Collect info about the volumes.] *****************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15
Friday 17 January 2025 04:50:08 -0500 (0:00:00.046) 0:10:00.243 ********
ok: [managed-node1] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "be6ebeb9-2a48-4dca-b637-85f4c4e7e728" }, "/dev/mapper/luks-be6ebeb9-2a48-4dca-b637-85f4c4e7e728": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-be6ebeb9-2a48-4dca-b637-85f4c4e7e728", "size": "4G", "type": "crypt", "uuid": "aee6b543-1e63-49be-b9ad-465f4e6c5eaa" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "q5PvpB-4kw5-zuiW-X5c3-3C9L-v0UQ-y98OeM" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } }
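The per-device map above (fstype, label, mountpoint, size, type, uuid) is gathered by a helper bundled with the test suite. You can approximate the same snapshot with plain lsblk (a sketch, not the tests' actual implementation; the register name is illustrative):

    # -p prints full device paths, -P prints KEY="value" pairs per device.
    - name: Collect info about the volumes (sketch)
      ansible.builtin.command:
        cmd: lsblk -p -P -o NAME,FSTYPE,LABEL,MOUNTPOINT,SIZE,TYPE,UUID
      register: storage_blockinfo
      changed_when: false  # purely a read-only query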
TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20
Friday 17 January 2025 04:50:08 -0500 (0:00:00.403) 0:10:00.646 ********
ok: [managed-node1] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002932", "end": "2025-01-17 04:50:08.946840", "rc": 0, "start": "2025-01-17 04:50:08.943908" }
STDOUT:
# system_role:storage
#
# /etc/fstab
# Created by anaconda on Thu Jun 20 10:23:46 2024
#
# Accessible filesystems, by reference, are maintained under '/dev/disk'
# See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info
#
UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
/dev/mapper/luks-be6ebeb9-2a48-4dca-b637-85f4c4e7e728 /opt/test1 xfs defaults 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25
Friday 17 January 2025 04:50:09 -0500 (0:00:00.409) 0:10:01.056 ********
ok: [managed-node1] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003203", "end": "2025-01-17 04:50:09.360777", "failed_when_result": false, "rc": 0, "start": "2025-01-17 04:50:09.357574" }
STDOUT:
luks-be6ebeb9-2a48-4dca-b637-85f4c4e7e728 /dev/mapper/foo-test1 -

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34
Friday 17 January 2025 04:50:09 -0500 (0:00:00.405) 0:10:01.462 ********
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node1

TASK [Set _storage_pool_tests] *************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5
Friday 17 January 2025 04:50:09 -0500 (0:00:00.115) 0:10:01.578 ********
ok: [managed-node1] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false }

TASK [Get VG shared value status] **********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18
Friday 17 January 2025 04:50:09 -0500 (0:00:00.076) 0:10:01.654 ********
ok: [managed-node1] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.019992", "end": "2025-01-17 04:50:09.986823", "rc": 0, "start": "2025-01-17 04:50:09.966831" }
STDOUT:
0

TASK [Verify that VG shared value checks out] **********************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24
Friday 17 January 2025 04:50:10 -0500 (0:00:00.433) 0:10:02.087 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed
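The shared-VG check is a plain LVM query: --binary makes the shared attribute print as 0 or 1, so the assertion only needs to compare one character. Re-expressed as stand-alone tasks (a sketch; "foo" is the pool name from this run and the register name is illustrative):

    - name: Get VG shared value status (sketch)
      ansible.builtin.command:
        cmd: vgs --noheadings --binary -o shared foo
      register: vgs_shared
      changed_when: false  # reporting command, never modifies the VG

    - name: Verify that VG shared value checks out (sketch)
      ansible.builtin.assert:
        that:
          - vgs_shared.stdout | trim == "0"  # the VG must not be shared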
TASK [Verify pool subset] ******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34
Friday 17 January 2025 04:50:10 -0500 (0:00:00.073) 0:10:02.161 ********
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node1
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node1

TASK [Set test variables] ******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2
Friday 17 January 2025 04:50:10 -0500 (0:00:00.127) 0:10:02.288 ********
ok: [managed-node1] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false }

TASK [Get the canonical device path for each member device] ********************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8
Friday 17 January 2025 04:50:10 -0500 (0:00:00.086) 0:10:02.375 ********
ok: [managed-node1] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" }

TASK [Set pvs lvm length] ******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17
Friday 17 January 2025 04:50:10 -0500 (0:00:00.438) 0:10:02.813 ********
ok: [managed-node1] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false }

TASK [Set pool pvs] ************************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22
Friday 17 January 2025 04:50:10 -0500 (0:00:00.074) 0:10:02.888 ********
ok: [managed-node1] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false }

TASK [Verify PV count] *********************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27
Friday 17 January 2025 04:50:10 -0500 (0:00:00.086) 0:10:02.974 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Set expected pv type] ****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36
Friday 17 January 2025 04:50:11 -0500 (0:00:00.077) 0:10:03.052 ********
ok: [managed-node1] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }

TASK [Set expected pv type] ****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41
Friday 17 January 2025 04:50:11 -0500 (0:00:00.069) 0:10:03.121 ********
ok: [managed-node1] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }

TASK [Set expected pv type] ****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46
Friday 17 January 2025 04:50:11 -0500 (0:00:00.072) 0:10:03.194 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check the type of each PV] ***********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:51
Friday 17 January 2025 04:50:11 -0500 (0:00:00.062) 0:10:03.256 ********
ok: [managed-node1] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" }
MSG: All assertions passed

TASK [Check that blivet supports PV grow to fill] ****************************** task
path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:64 Friday 17 January 2025 04:50:11 -0500 (0:00:00.077) 0:10:03.333 ******** ok: [managed-node1] => { "changed": false, "rc": 0 } STDOUT: False STDERR: Shared connection to 10.31.46.65 closed. TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:73 Friday 17 January 2025 04:50:11 -0500 (0:00:00.362) 0:10:03.695 ******** skipping: [managed-node1] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:83 Friday 17 January 2025 04:50:11 -0500 (0:00:00.080) 0:10:03.776 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node1 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Friday 17 January 2025 04:50:11 -0500 (0:00:00.129) 0:10:03.906 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Friday 17 January 2025 04:50:11 -0500 (0:00:00.061) 0:10:03.967 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Friday 17 January 2025 04:50:12 -0500 (0:00:00.073) 0:10:04.040 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Friday 17 January 2025 04:50:12 -0500 (0:00:00.063) 0:10:04.104 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Friday 17 January 2025 04:50:12 -0500 (0:00:00.067) 0:10:04.171 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Friday 17 January 2025 04:50:12 -0500 (0:00:00.089) 0:10:04.261 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 
Friday 17 January 2025 04:50:12 -0500 (0:00:00.098) 0:10:04.359 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Friday 17 January 2025 04:50:12 -0500 (0:00:00.095) 0:10:04.455 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Friday 17 January 2025 04:50:12 -0500 (0:00:00.087) 0:10:04.543 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Friday 17 January 2025 04:50:12 -0500 (0:00:00.082) 0:10:04.626 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Friday 17 January 2025 04:50:12 -0500 (0:00:00.057) 0:10:04.683 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:86 Friday 17 January 2025 04:50:12 -0500 (0:00:00.098) 0:10:04.781 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node1 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Friday 17 January 2025 04:50:12 -0500 (0:00:00.162) 0:10:04.943 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node1 TASK [Get information about the LV] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Friday 17 January 2025 04:50:13 -0500 (0:00:00.161) 0:10:05.104 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Friday 17 January 2025 04:50:13 -0500 (0:00:00.088) 0:10:05.193 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Friday 17 January 2025 04:50:13 -0500 (0:00:00.109) 0:10:05.302 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Friday 17 January 2025 04:50:13 -0500 (0:00:00.067) 0:10:05.370 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Friday 17 January 2025 04:50:13 -0500 (0:00:00.062) 0:10:05.433 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Friday 17 January 2025 04:50:13 -0500 (0:00:00.067) 0:10:05.501 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Friday 17 January 2025 04:50:13 -0500 (0:00:00.057) 0:10:05.559 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:89 Friday 17 January 2025 04:50:13 -0500 (0:00:00.078) 0:10:05.637 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node1 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Friday 17 January 2025 04:50:13 -0500 (0:00:00.226) 0:10:05.864 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node1 TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Friday 17 January 2025 04:50:14 -0500 (0:00:00.192) 0:10:06.056 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Friday 17 January 2025 04:50:14 -0500 (0:00:00.074) 0:10:06.131 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: 
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Friday 17 January 2025 04:50:14 -0500 (0:00:00.075) 0:10:06.206 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Friday 17 January 2025 04:50:14 -0500 (0:00:00.070) 0:10:06.276 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:92 Friday 17 January 2025 04:50:14 -0500 (0:00:00.080) 0:10:06.357 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node1 TASK [Set test variables] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Friday 17 January 2025 04:50:14 -0500 (0:00:00.137) 0:10:06.494 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Friday 17 January 2025 04:50:14 -0500 (0:00:00.067) 0:10:06.562 ******** skipping: [managed-node1] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Friday 17 January 2025 04:50:14 -0500 (0:00:00.065) 0:10:06.627 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node1 TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Friday 17 January 2025 04:50:14 -0500 (0:00:00.123) 0:10:06.750 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Friday 17 January 2025 04:50:14 -0500 (0:00:00.079) 0:10:06.830 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Friday 17 January 2025 04:50:14 -0500 (0:00:00.099) 0:10:06.930 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Friday 17 January 2025 04:50:14 -0500 (0:00:00.075) 0:10:07.006 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Friday 17 January 2025 04:50:15 -0500 (0:00:00.072) 0:10:07.078 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Friday 17 January 2025 04:50:15 -0500 (0:00:00.070) 0:10:07.148 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Friday 17 January 2025 04:50:15 -0500 (0:00:00.104) 0:10:07.252 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:95 Friday 17 January 2025 04:50:15 -0500 (0:00:00.129) 0:10:07.382 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node1 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Friday 17 January 2025 04:50:15 -0500 (0:00:00.272) 0:10:07.654 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node1 TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Friday 17 January 2025 04:50:15 -0500 (0:00:00.128) 0:10:07.783 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Friday 17 January 2025 04:50:15 -0500 (0:00:00.087) 0:10:07.870 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Friday 17 January 2025 04:50:15 -0500 (0:00:00.057) 0:10:07.928 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about 
VDO compression] *********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Friday 17 January 2025 04:50:15 -0500 (0:00:00.058) 0:10:07.986 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Friday 17 January 2025 04:50:16 -0500 (0:00:00.057) 0:10:08.044 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Friday 17 January 2025 04:50:16 -0500 (0:00:00.059) 0:10:08.104 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Friday 17 January 2025 04:50:16 -0500 (0:00:00.058) 0:10:08.162 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:98 Friday 17 January 2025 04:50:16 -0500 (0:00:00.061) 0:10:08.224 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node1 TASK [Run 'stratis report'] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Friday 17 January 2025 04:50:16 -0500 (0:00:00.150) 0:10:08.375 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:11 Friday 17 January 2025 04:50:16 -0500 (0:00:00.056) 0:10:08.432 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Friday 17 January 2025 04:50:16 -0500 (0:00:00.053) 0:10:08.485 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:25 Friday 17 January 2025 04:50:16 -0500 (0:00:00.062) 0:10:08.547 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:34 
Friday 17 January 2025 04:50:16 -0500 (0:00:00.058) 0:10:08.606 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:44 Friday 17 January 2025 04:50:16 -0500 (0:00:00.057) 0:10:08.664 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:101 Friday 17 January 2025 04:50:16 -0500 (0:00:00.104) 0:10:08.769 ******** ok: [managed-node1] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Friday 17 January 2025 04:50:16 -0500 (0:00:00.057) 0:10:08.827 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node1 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Friday 17 January 2025 04:50:16 -0500 (0:00:00.175) 0:10:09.003 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Friday 17 January 2025 04:50:17 -0500 (0:00:00.091) 0:10:09.095 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node1 included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node1 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Friday 17 January 2025 04:50:17 
-0500 (0:00:00.473) 0:10:09.568 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-be6ebeb9-2a48-4dca-b637-85f4c4e7e728" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Friday 17 January 2025 04:50:17 -0500 (0:00:00.121) 0:10:09.689 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Friday 17 January 2025 04:50:17 -0500 (0:00:00.123) 0:10:09.813 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Friday 17 January 2025 04:50:17 -0500 (0:00:00.091) 0:10:09.905 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Friday 17 January 2025 04:50:18 -0500 (0:00:00.119) 0:10:10.024 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Friday 17 January 2025 04:50:18 -0500 (0:00:00.121) 0:10:10.146 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Friday 17 January 2025 04:50:18 -0500 (0:00:00.126) 0:10:10.272 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Friday 17 January 2025 04:50:18 -0500 (0:00:00.082) 0:10:10.355 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Friday 17 January 2025 04:50:18 -0500 (0:00:00.086) 0:10:10.441 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Friday 17 January 2025 04:50:18 -0500 (0:00:00.184) 0:10:10.626 ******** skipping: [managed-node1] => { "changed": 
false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Friday 17 January 2025 04:50:18 -0500 (0:00:00.059) 0:10:10.685 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Friday 17 January 2025 04:50:18 -0500 (0:00:00.060) 0:10:10.745 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-be6ebeb9-2a48-4dca-b637-85f4c4e7e728 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Friday 17 January 2025 04:50:18 -0500 (0:00:00.098) 0:10:10.844 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Friday 17 January 2025 04:50:18 -0500 (0:00:00.067) 0:10:10.911 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Friday 17 January 2025 04:50:18 -0500 (0:00:00.083) 0:10:10.995 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Friday 17 January 2025 04:50:19 -0500 (0:00:00.069) 0:10:11.064 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Friday 17 January 2025 04:50:19 -0500 (0:00:00.112) 0:10:11.177 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: 
TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17
Friday 17 January 2025 04:50:18 -0500 (0:00:00.098) 0:10:10.844 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24
Friday 17 January 2025 04:50:18 -0500 (0:00:00.067) 0:10:10.911 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33
Friday 17 January 2025 04:50:18 -0500 (0:00:00.083) 0:10:10.995 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify fingerprint] ******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45
Friday 17 January 2025 04:50:19 -0500 (0:00:00.069) 0:10:11.064 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Clean up variables] ******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52
Friday 17 January 2025 04:50:19 -0500 (0:00:00.112) 0:10:11.177 ********
ok: [managed-node1] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6
Friday 17 January 2025 04:50:19 -0500 (0:00:00.090) 0:10:11.268 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14
Friday 17 January 2025 04:50:19 -0500 (0:00:00.076) 0:10:11.344 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3
Friday 17 January 2025 04:50:19 -0500 (0:00:00.083) 0:10:11.427 ********
ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737107402.45618, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1737107402.45618, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 428793, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1737107402.45618, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9
Friday 17 January 2025 04:50:19 -0500 (0:00:00.520) 0:10:11.948 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16
Friday 17 January 2025 04:50:20 -0500 (0:00:00.119) 0:10:12.067 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23
Friday 17 January 2025 04:50:20 -0500 (0:00:00.094) 0:10:12.162 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Process volume type (set initial value) (1/2)] ***************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29
Friday 17 January 2025 04:50:20 -0500 (0:00:00.070) 0:10:12.232 ********
ok: [managed-node1] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [Process volume type (get RAID value) (2/2)] ******************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33
Friday 17 January 2025 04:50:20 -0500 (0:00:00.098) 0:10:12.330 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
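The device-node checks above are a stat of the expected path plus assertions on the result; note the stat shows a block device behind the /dev/mapper symlink (mimetype inode/symlink, isblk true). As stand-alone tasks (a sketch; the real test derives the path and expectations from the volume type):

    - name: See whether the device node is present (sketch)
      ansible.builtin.stat:
        path: /dev/mapper/foo-test1
        follow: true  # resolve the device-mapper symlink to the dm node
      register: dev_node

    - name: Verify the presence of the device node (sketch)
      ansible.builtin.assert:
        that:
          - dev_node.stat.exists
          - dev_node.stat.isblk  # LVM volumes must show up as block devices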
TASK [Verify the volume's device type] *****************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38
Friday 17 January 2025 04:50:20 -0500 (0:00:00.058) 0:10:12.389 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3
Friday 17 January 2025 04:50:20 -0500 (0:00:00.079) 0:10:12.468 ********
ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737107402.5711803, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1737107402.5711803, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 440606, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1737107402.5711803, "nlink": 1, "path": "/dev/mapper/luks-be6ebeb9-2a48-4dca-b637-85f4c4e7e728", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Ensure cryptsetup is present] ********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10
Friday 17 January 2025 04:50:20 -0500 (0:00:00.494) 0:10:12.962 ********
ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16
Friday 17 January 2025 04:50:22 -0500 (0:00:01.084) 0:10:14.047 ********
ok: [managed-node1] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.025935", "end": "2025-01-17 04:50:22.498644", "rc": 0, "start": "2025-01-17 04:50:22.472709" }
STDOUT:
LUKS header information for /dev/mapper/foo-test1
Version: 1
Cipher name: aes
Cipher mode: xts-plain64
Hash spec: sha256
Payload offset: 8192
MK bits: 512
MK digest: 1a dd c3 07 68 b1 98 4d 3f fe db c0 ad 7a 82 48 cc 60 0a fc
MK salt: 15 73 3c 20 ad 01 85 94 d7 8a 52 75 5d c4 ec 76 50 fc ff 75 b4 9a 75 67 37 a0 f0 0a c5 0c 24 1f
MK iterations: 23239
UUID: be6ebeb9-2a48-4dca-b637-85f4c4e7e728
Key Slot 0: ENABLED
Iterations: 371834
Salt: 9e 61 bb 78 de e8 61 c3 80 40 34 57 cd d5 c4 e2 ed e9 a3 5d 76 c3 f6 82 00 19 ea 08 8d dd 8d 5e
Key material offset: 8
AF stripes: 4000
Key Slot 1: DISABLED
Key Slot 2: DISABLED
Key Slot 3: DISABLED
Key Slot 4: DISABLED
Key Slot 5: DISABLED
Key Slot 6: DISABLED
Key Slot 7: DISABLED
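All of the remaining encryption checks read values out of that header dump. Re-run by hand, the collection step plus a couple of representative assertions would look like this (a sketch; the expected values are exactly the ones printed above, i.e. a LUKS1 header using the aes cipher):

    - name: Collect LUKS info for this volume (sketch)
      ansible.builtin.command:
        cmd: cryptsetup luksDump /dev/mapper/foo-test1
      register: luks_dump
      changed_when: false  # dumping the header never modifies it

    - name: Spot-check the LUKS header (sketch)
      ansible.builtin.assert:
        that:
          - luks_dump.stdout is search('Version:\s+1')  # LUKS1 format
          - luks_dump.stdout is search('Cipher name:\s+aes')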
TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22
Friday 17 January 2025 04:50:22 -0500 (0:00:00.643) 0:10:14.691 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29
Friday 17 January 2025 04:50:22 -0500 (0:00:00.147) 0:10:14.838 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40
Friday 17 January 2025 04:50:22 -0500 (0:00:00.112) 0:10:14.951 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46
Friday 17 January 2025 04:50:23 -0500 (0:00:00.082) 0:10:15.033 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Check LUKS version] ******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51
Friday 17 January 2025 04:50:23 -0500 (0:00:00.127) 0:10:15.161 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63
Friday 17 January 2025 04:50:23 -0500 (0:00:00.079) 0:10:15.240 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75
Friday 17 January 2025 04:50:23 -0500 (0:00:00.122) 0:10:15.363 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set test variables] ******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87
Friday 17 January 2025 04:50:23 -0500 (0:00:00.103) 0:10:15.466 ********
ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-be6ebeb9-2a48-4dca-b637-85f4c4e7e728 /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93
Friday 17 January 2025 04:50:23 -0500 (0:00:00.149) 0:10:15.616 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100
Friday 17 January 2025 04:50:23 -0500 (0:00:00.108) 0:10:15.724 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Check backing device of crypttab entry] ********************************** task path:
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Friday 17 January 2025 04:50:23 -0500 (0:00:00.103) 0:10:15.828 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Friday 17 January 2025 04:50:23 -0500 (0:00:00.084) 0:10:15.913 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Friday 17 January 2025 04:50:23 -0500 (0:00:00.080) 0:10:15.993 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Friday 17 January 2025 04:50:24 -0500 (0:00:00.075) 0:10:16.068 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Friday 17 January 2025 04:50:24 -0500 (0:00:00.088) 0:10:16.156 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Friday 17 January 2025 04:50:24 -0500 (0:00:00.068) 0:10:16.225 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Friday 17 January 2025 04:50:24 -0500 (0:00:00.060) 0:10:16.285 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Friday 17 January 2025 04:50:24 -0500 (0:00:00.058) 0:10:16.344 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Friday 17 January 2025 04:50:24 -0500 (0:00:00.056) 0:10:16.400 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Friday 17 
January 2025 04:50:24 -0500 (0:00:00.049) 0:10:16.449 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID spare devices count] ******************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54
Friday 17 January 2025 04:50:24 -0500 (0:00:00.048) 0:10:16.498 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID metadata version] *********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62
Friday 17 January 2025 04:50:24 -0500 (0:00:00.046) 0:10:16.544 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID chunk size] ***************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70
Friday 17 January 2025 04:50:24 -0500 (0:00:00.037) 0:10:16.582 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the actual size of the volume] *************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3
Friday 17 January 2025 04:50:24 -0500 (0:00:00.052) 0:10:16.635 ********
ok: [managed-node1] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" }

TASK [Parse the requested size of the volume] **********************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11
Friday 17 January 2025 04:50:24 -0500 (0:00:00.348) 0:10:16.983 ********
ok: [managed-node1] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" }
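Both Parse tasks normalize a size into bytes so the comparison can be exact: the LV's actual size and the requested "4g" both come out as 4294967296. The check this feeds is essentially the following (a sketch; the real test goes through storage_test_expected_size and has extra branches for percentage-of-pool sizes):

    - name: Verify that the actual size matches the requested size (sketch)
      ansible.builtin.assert:
        that:
          # human_to_bytes treats '4g' as 4 GiB = 4294967296 bytes
          - storage_test_actual_size.bytes == ('4g' | human_to_bytes)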
TASK [Show test blockinfo] *****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50
Friday 17 January 2025 04:50:25 -0500 (0:00:00.061) 0:10:17.930 ********
skipping: [managed-node1] => {}

TASK [Show test pool size] *****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54
Friday 17 January 2025 04:50:25 -0500 (0:00:00.063) 0:10:17.994 ********
skipping: [managed-node1] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58
Friday 17 January 2025 04:50:26 -0500 (0:00:00.066) 0:10:18.060 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Default thin pool reserved space values] *********************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67
Friday 17 January 2025 04:50:26 -0500 (0:00:00.069) 0:10:18.130 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Default minimal thin pool reserved space size] ***************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71
Friday 17 January 2025 04:50:26 -0500 (0:00:00.060) 0:10:18.190 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Default maximal thin pool reserved space size] ***************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76
Friday 17 January 2025 04:50:26 -0500 (0:00:00.063) 0:10:18.254 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Calculate maximum usable space in thin pool] *****************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82
Friday 17 January 2025 04:50:26 -0500 (0:00:00.059) 0:10:18.313 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Apply upper size limit to max usable thin pool space] ********************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86
Friday 17 January 2025 04:50:26 -0500 (0:00:00.058) 0:10:18.372 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Apply lower size limit to max usable thin pool space] ********************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91
Friday 17 January 2025 04:50:26 -0500 (0:00:00.058) 0:10:18.431 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Convert maximum usable thin pool space from int to Size] *****************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96
Friday 17 January 2025 04:50:26 -0500 (0:00:00.058) 0:10:18.489 ********
"Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Friday 17 January 2025 04:50:26 -0500 (0:00:00.058) 0:10:18.548 ******** skipping: [managed-node1] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Friday 17 January 2025 04:50:26 -0500 (0:00:00.065) 0:10:18.613 ******** skipping: [managed-node1] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Friday 17 January 2025 04:50:26 -0500 (0:00:00.058) 0:10:18.672 ******** skipping: [managed-node1] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Friday 17 January 2025 04:50:26 -0500 (0:00:00.080) 0:10:18.752 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Friday 17 January 2025 04:50:26 -0500 (0:00:00.057) 0:10:18.809 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Friday 17 January 2025 04:50:26 -0500 (0:00:00.047) 0:10:18.856 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Friday 17 January 2025 04:50:26 -0500 (0:00:00.049) 0:10:18.906 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Friday 17 January 2025 04:50:26 -0500 (0:00:00.052) 0:10:18.958 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Friday 17 January 2025 04:50:27 -0500 (0:00:00.117) 0:10:19.076 ******** ok: [managed-node1] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Friday 17 January 2025 04:50:27 -0500 (0:00:00.051) 0:10:19.127 
Friday 17 January 2025 04:50:27 -0500 (0:00:00.051) 0:10:19.127 ********
ok: [managed-node1] => { "storage_test_expected_size": "4294967296" }

TASK [Assert expected size is actual size] *************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151
Friday 17 January 2025 04:50:27 -0500 (0:00:00.061) 0:10:19.188 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5
Friday 17 January 2025 04:50:27 -0500 (0:00:00.081) 0:10:19.270 ********
ok: [managed-node1] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.021143", "end": "2025-01-17 04:50:27.587863", "rc": 0, "start": "2025-01-17 04:50:27.566720" }
STDOUT:
LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [Set LV segment type] *****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13
Friday 17 January 2025 04:50:27 -0500 (0:00:00.432) 0:10:19.702 ********
ok: [managed-node1] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false }

TASK [Check segment type] ******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17
Friday 17 January 2025 04:50:27 -0500 (0:00:00.095) 0:10:19.797 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Set LV cache size] *******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24
Friday 17 January 2025 04:50:27 -0500 (0:00:00.077) 0:10:19.875 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the requested cache size] ******************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31
Friday 17 January 2025 04:50:27 -0500 (0:00:00.063) 0:10:19.938 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set expected cache size] *************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37
Friday 17 January 2025 04:50:27 -0500 (0:00:00.070) 0:10:20.008 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42
Friday 17 January 2025 04:50:28 -0500 (0:00:00.069) 0:10:20.078 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25
Friday 17 January 2025 04:50:28 -0500 (0:00:00.063) 0:10:20.141 ********
ok: [managed-node1] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44
Friday 17 January 2025 04:50:28 -0500 (0:00:00.058) 0:10:20.200 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54
Friday 17 January 2025 04:50:28 -0500 (0:00:00.053) 0:10:20.254 ********
ok: [managed-node1] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false }

TASK [Clean up] ****************************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:510
Friday 17 January 2025 04:50:28 -0500 (0:00:00.057) 0:10:20.312 ********

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Friday 17 January 2025 04:50:28 -0500 (0:00:00.196) 0:10:20.508 ********
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node1

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Friday 17 January 2025 04:50:28 -0500 (0:00:00.100) 0:10:20.609 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Friday 17 January 2025 04:50:28 -0500 (0:00:00.077) 0:10:20.686 ********
skipping: [managed-node1] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node1] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node1] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" }
skipping: [managed-node1] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Friday 17 January 2025 04:50:28 -0500 (0:00:00.144) 0:10:20.831 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Friday 17 January 2025 04:50:28 -0500 (0:00:00.066) 0:10:20.897 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Friday 17 January 2025 04:50:28 -0500 (0:00:00.058) 0:10:20.956 ********
ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Friday 17 January 2025 04:50:28 -0500 (0:00:00.062) 0:10:21.018 ********
ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Friday 17 January 2025 04:50:29 -0500 (0:00:00.062) 0:10:21.080 ********
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node1

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
Friday 17 January 2025 04:50:29 -0500 (0:00:00.136) 0:10:21.217 ********
ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] }

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Friday 17 January 2025 04:50:33 -0500 (0:00:03.919) 0:10:25.136 ********
ok: [managed-node1] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" }

TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14
Friday 17 January 2025 04:50:33 -0500 (0:00:00.069) 0:10:25.206 ********
"type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Friday 17 January 2025 04:50:33 -0500 (0:00:00.080) 0:10:25.286 ******** ok: [managed-node1] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Friday 17 January 2025 04:50:37 -0500 (0:00:04.229) 0:10:29.515 ******** included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 17 January 2025 04:50:37 -0500 (0:00:00.069) 0:10:29.585 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 17 January 2025 04:50:37 -0500 (0:00:00.041) 0:10:29.626 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 17 January 2025 04:50:37 -0500 (0:00:00.054) 0:10:29.680 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Friday 17 January 2025 04:50:37 -0500 (0:00:00.042) 0:10:29.723 ******** ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Friday 17 January 2025 04:50:38 -0500 (0:00:00.678) 0:10:30.402 ******** ok: [managed-node1] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { 
"name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "mdadm-grow-continue@.service": { "name": 
"mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, 
"plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", 
"state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Friday 17 January 2025 04:50:39 -0500 (0:00:01.022) 0:10:31.424 ******** ok: [managed-node1] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Friday 17 January 2025 04:50:39 -0500 (0:00:00.085) 0:10:31.510 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Friday 17 January 2025 04:50:39 -0500 (0:00:00.053) 0:10:31.564 ******** changed: [managed-node1] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-be6ebeb9-2a48-4dca-b637-85f4c4e7e728", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-be6ebeb9-2a48-4dca-b637-85f4c4e7e728", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-be6ebeb9-2a48-4dca-b637-85f4c4e7e728", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-be6ebeb9-2a48-4dca-b637-85f4c4e7e728", "state": "absent" } ], "packages": [ 
"e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=q5PvpB-4kw5-zuiW-X5c3-3C9L-v0UQ-y98OeM", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Friday 17 January 2025 04:51:14 -0500 (0:00:34.957) 0:11:06.522 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Friday 17 January 2025 04:51:14 -0500 (0:00:00.059) 0:11:06.581 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737107405.3141832, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "419442f332ec520f83a0118d71199db153e858b4", "ctime": 1737107405.3111832, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263644, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1737107405.3111832, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1311, "uid": 0, "version": "18446744072031193646", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Friday 17 January 2025 04:51:15 -0500 (0:00:00.442) 0:11:07.024 ******** ok: [managed-node1] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Friday 17 January 2025 04:51:15 -0500 (0:00:00.446) 0:11:07.470 ******** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Friday 17 January 2025 04:51:15 -0500 (0:00:00.054) 0:11:07.525 ******** ok: [managed-node1] => { "blivet_output": { "actions": [ { "action": "destroy format", 
"device": "/dev/mapper/luks-be6ebeb9-2a48-4dca-b637-85f4c4e7e728", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-be6ebeb9-2a48-4dca-b637-85f4c4e7e728", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-be6ebeb9-2a48-4dca-b637-85f4c4e7e728", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-be6ebeb9-2a48-4dca-b637-85f4c4e7e728", "state": "absent" } ], "packages": [ "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=q5PvpB-4kw5-zuiW-X5c3-3C9L-v0UQ-y98OeM", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Friday 17 January 2025 04:51:15 -0500 (0:00:00.082) 0:11:07.608 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Friday 17 January 2025 04:51:15 -0500 (0:00:00.060) 0:11:07.668 ******** ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": "UUID=q5PvpB-4kw5-zuiW-X5c3-3C9L-v0UQ-y98OeM", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, 
"raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Friday 17 January 2025 04:51:15 -0500 (0:00:00.097) 0:11:07.766 ******** changed: [managed-node1] => (item={u'src': u'/dev/mapper/luks-be6ebeb9-2a48-4dca-b637-85f4c4e7e728', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-be6ebeb9-2a48-4dca-b637-85f4c4e7e728", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-be6ebeb9-2a48-4dca-b637-85f4c4e7e728" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Friday 17 January 2025 04:51:16 -0500 (0:00:00.736) 0:11:08.503 ******** ok: [managed-node1] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Friday 17 January 2025 04:51:17 -0500 (0:00:00.657) 0:11:09.160 ******** TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Friday 17 January 2025 04:51:17 -0500 (0:00:00.095) 0:11:09.256 ******** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Friday 17 January 2025 04:51:17 -0500 (0:00:00.054) 0:11:09.310 ******** ok: [managed-node1] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Friday 17 January 2025 04:51:17 -0500 (0:00:00.543) 0:11:09.854 ******** ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737107409.3591874, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "7eb1e5f847d73afb5d5354b6d696884579a8fdb3", "ctime": 1737107406.8041847, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263663, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1737107406.8031847, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "18446744072031200636", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to 
TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200
Friday 17 January 2025 04:51:18 -0500 (0:00:00.660) 0:11:10.515 ********
changed: [managed-node1] => (item={u'state': u'absent', u'password': u'-', u'name': u'luks-be6ebeb9-2a48-4dca-b637-85f4c4e7e728', u'backing_device': u'/dev/mapper/foo-test1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-be6ebeb9-2a48-4dca-b637-85f4c4e7e728", "password": "-", "state": "absent" }, "found": 1 }
MSG: 1 line(s) removed

TASK [fedora.linux_system_roles.storage : Update facts] ************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222
Friday 17 January 2025 04:51:19 -0500 (0:00:00.523) 0:11:11.039 ********
ok: [managed-node1]

TASK [Verify role results] *****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:520
Friday 17 January 2025 04:51:20 -0500 (0:00:01.013) 0:11:12.052 ********
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node1

TASK [Print out pool information] **********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2
Friday 17 January 2025 04:51:20 -0500 (0:00:00.159) 0:11:12.211 ********
skipping: [managed-node1] => {}

TASK [Print out volume information] ********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7
Friday 17 January 2025 04:51:20 -0500 (0:00:00.061) 0:11:12.272 ********
ok: [managed-node1] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": "UUID=q5PvpB-4kw5-zuiW-X5c3-3C9L-v0UQ-y98OeM", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }
TASK [Collect info about the volumes.] *****************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15
Friday 17 January 2025 04:51:20 -0500 (0:00:00.094) 0:11:12.366 ********
ok: [managed-node1] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } }
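For a quick manual cross-check of the same information the test gathers here, an ad-hoc task along these lines would do (a sketch; the test's own implementation of "Collect info" is not shown in this log):

    - name: Ad-hoc equivalent of the blockdev scan above (illustrative)
      command: lsblk -o NAME,FSTYPE,LABEL,MOUNTPOINT,SIZE,TYPE,UUID
      changed_when: false   # read-only query; never report a change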
TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20
Friday 17 January 2025 04:51:20 -0500 (0:00:00.529) 0:11:12.896 ********
ok: [managed-node1] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002854", "end": "2025-01-17 04:51:21.361431", "rc": 0, "start": "2025-01-17 04:51:21.358577" }

STDOUT:

# system_role:storage
#
# /etc/fstab
# Created by anaconda on Thu Jun 20 10:23:46 2024
#
# Accessible filesystems, by reference, are maintained under '/dev/disk'
# See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info
#
UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25
Friday 17 January 2025 04:51:21 -0500 (0:00:00.591) 0:11:13.488 ********
ok: [managed-node1] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002733", "end": "2025-01-17 04:51:21.776542", "failed_when_result": false, "rc": 0, "start": "2025-01-17 04:51:21.773809" }

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34
Friday 17 January 2025 04:51:21 -0500 (0:00:00.416) 0:11:13.904 ********

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44
Friday 17 January 2025 04:51:21 -0500 (0:00:00.086) 0:11:13.991 ********
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node1

TASK [Set storage volume test variables] ***************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2
Friday 17 January 2025 04:51:22 -0500 (0:00:00.184) 0:11:14.176 ********
ok: [managed-node1] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [Run test verify for {{ storage_test_volume_subset }}] ********************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19
Friday 17 January 2025 04:51:22 -0500 (0:00:00.093) 0:11:14.269 ********
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node1
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node1
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node1
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node1
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node1
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node1
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node1
included: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node1
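Each entry in _storage_volume_tests fans out to one included file above, and the task name still shows the raw {{ storage_test_volume_subset }} expression. That is consistent with an include loop of roughly this shape (an inference from the output, not the test's verbatim source):

    - name: Run test verify for {{ storage_test_volume_subset }}
      include_tasks: "test-verify-volume-{{ storage_test_volume_subset }}.yml"
      loop: "{{ _storage_volume_tests }}"
      loop_control:
        loop_var: storage_test_volume_subset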
TASK [Get expected mount device based on device type] **************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7
Friday 17 January 2025 04:51:22 -0500 (0:00:00.371) 0:11:14.641 ********
ok: [managed-node1] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11
Friday 17 January 2025 04:51:22 -0500 (0:00:00.089) 0:11:14.731 ********
ok: [managed-node1] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "", "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Get information about the mountpoint directory] **************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19
Friday 17 January 2025 04:51:22 -0500 (0:00:00.142) 0:11:14.873 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28
Friday 17 January 2025 04:51:22 -0500 (0:00:00.120) 0:11:14.994 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify mount directory user] *********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36
Friday 17 January 2025 04:51:23 -0500 (0:00:00.091) 0:11:15.085 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify mount directory group] ********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42
Friday 17 January 2025 04:51:23 -0500 (0:00:00.104) 0:11:15.189 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify mount directory permissions] **************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48
Friday 17 January 2025 04:51:23 -0500 (0:00:00.148) 0:11:15.338 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get path of test volume device] ******************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57
Friday 17 January 2025 04:51:23 -0500 (0:00:00.083) 0:11:15.422 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63
Friday 17 January 2025 04:51:23 -0500 (0:00:00.111) 0:11:15.533 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69
Friday 17 January 2025 04:51:23 -0500 (0:00:00.106) 0:11:15.640 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79
Friday 17 January 2025 04:51:23 -0500 (0:00:00.083) 0:11:15.724 ********
ok: [managed-node1] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }
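All of the mount checks above skip because the volume is absent. When a volume is expected to be mounted, the verification amounts to matching the device against the host's mount facts; a sketch of that check using the storage_test_device_path fact set above (the filter chain is illustrative, not the test's verbatim assertion):

    - name: Verify the current mount state by device (illustrative)
      assert:
        that:
          - ansible_facts.mounts | selectattr('device', 'equalto', storage_test_device_path) | list | length > 0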
"storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Friday 17 January 2025 04:51:23 -0500 (0:00:00.080) 0:11:15.804 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Friday 17 January 2025 04:51:23 -0500 (0:00:00.102) 0:11:15.906 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Friday 17 January 2025 04:51:23 -0500 (0:00:00.088) 0:11:15.995 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Friday 17 January 2025 04:51:24 -0500 (0:00:00.067) 0:11:16.062 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Friday 17 January 2025 04:51:24 -0500 (0:00:00.085) 0:11:16.148 ******** ok: [managed-node1] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Friday 17 January 2025 04:51:24 -0500 (0:00:00.107) 0:11:16.256 ******** ok: [managed-node1] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Friday 17 January 2025 04:51:24 -0500 (0:00:00.122) 0:11:16.378 ******** skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Friday 17 January 2025 04:51:24 -0500 (0:00:00.213) 0:11:16.591 ******** skipping: 
TASK [See whether the device node is present] **********************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3
Friday 17 January 2025 04:51:24 -0500 (0:00:00.086) 0:11:16.678 ********
ok: [managed-node1] => { "changed": false, "stat": { "atime": 1737107474.324256, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1737107474.324256, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 28267, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1737107474.324256, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9
Friday 17 January 2025 04:51:25 -0500 (0:00:00.679) 0:11:17.359 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16
Friday 17 January 2025 04:51:25 -0500 (0:00:00.090) 0:11:17.449 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23
Friday 17 January 2025 04:51:25 -0500 (0:00:00.067) 0:11:17.516 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Process volume type (set initial value) (1/2)] ***************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29
Friday 17 January 2025 04:51:25 -0500 (0:00:00.065) 0:11:17.582 ********
ok: [managed-node1] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false }

TASK [Process volume type (get RAID value) (2/2)] ******************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33
Friday 17 January 2025 04:51:25 -0500 (0:00:00.060) 0:11:17.643 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38
Friday 17 January 2025 04:51:25 -0500 (0:00:00.066) 0:11:17.710 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
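The device-node check is a plain stat followed by an assert on the result: the disk itself is still present, since only the LUKS mapping and the contents on it were removed. A minimal sketch of that pair, where the register name is an assumption for illustration:

    - name: See whether the device node is present (illustrative)
      stat:
        path: "{{ storage_test_device_path }}"
      register: storage_test_dev   # hypothetical register name

    - name: Verify the presence/absence of the device node (illustrative)
      assert:
        that:
          - storage_test_dev.stat.exists and storage_test_dev.stat.isblk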
TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3
Friday 17 January 2025 04:51:25 -0500 (0:00:00.058) 0:11:17.768 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Ensure cryptsetup is present] ********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10
Friday 17 January 2025 04:51:25 -0500 (0:00:00.078) 0:11:17.846 ********
ok: [managed-node1] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] }

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16
Friday 17 January 2025 04:51:26 -0500 (0:00:01.125) 0:11:18.972 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22
Friday 17 January 2025 04:51:27 -0500 (0:00:00.103) 0:11:19.075 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29
Friday 17 January 2025 04:51:27 -0500 (0:00:00.088) 0:11:19.164 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40
Friday 17 January 2025 04:51:27 -0500 (0:00:00.090) 0:11:19.254 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46
Friday 17 January 2025 04:51:27 -0500 (0:00:00.065) 0:11:19.319 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51
Friday 17 January 2025 04:51:27 -0500 (0:00:00.099) 0:11:19.419 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63
Friday 17 January 2025 04:51:27 -0500 (0:00:00.086) 0:11:19.505 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
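Even though this volume ends up unencrypted, the test installs cryptsetup first so the later LUKS probes can run. When a volume is encrypted, the usual way to collect the header details checked below (version, key size, cipher) is cryptsetup luksDump; a sketch, with the register name as an assumption:

    - name: Collect LUKS info for this volume (illustrative)
      command: cryptsetup luksDump {{ storage_test_device_path }}
      register: luks_info       # hypothetical register name
      changed_when: false       # dump is read-only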
TASK [Check LUKS cipher] *******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75
Friday 17 January 2025 04:51:27 -0500 (0:00:00.073) 0:11:19.578 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set test variables] ******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87
Friday 17 January 2025 04:51:27 -0500 (0:00:00.103) 0:11:19.682 ********
ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93
Friday 17 January 2025 04:51:27 -0500 (0:00:00.117) 0:11:19.799 ********
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100
Friday 17 January 2025 04:51:27 -0500 (0:00:00.140) 0:11:19.940 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108
Friday 17 January 2025 04:51:28 -0500 (0:00:00.093) 0:11:20.033 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116
Friday 17 January 2025 04:51:28 -0500 (0:00:00.089) 0:11:20.123 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124
Friday 17 January 2025 04:51:28 -0500 (0:00:00.115) 0:11:20.239 ********
ok: [managed-node1] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [Get information about RAID] **********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8
Friday 17 January 2025 04:51:28 -0500 (0:00:00.057) 0:11:20.297 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set active devices regex] ************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14
Friday 17 January 2025 04:51:28 -0500 (0:00:00.055) 0:11:20.352 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
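A crypttab entry has the form "name backing-device key-file options"; the entry this run removed earlier would have read "luks-be6ebeb9-2a48-4dca-b637-85f4c4e7e728 /dev/mapper/foo-test1 -" (reconstructed from the loop item shown above, not copied from the node). With the device gone, the test expects zero entries, presumably asserted roughly like this:

    - name: Check for /etc/crypttab entry (illustrative)
      assert:
        that:
          - _storage_test_crypttab_entries | length == _storage_test_expected_crypttab_entries | int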
TASK [Set spare devices regex] *************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19
Friday 17 January 2025 04:51:28 -0500 (0:00:00.063) 0:11:20.416 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set md version regex] ****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24
Friday 17 January 2025 04:51:28 -0500 (0:00:00.057) 0:11:20.473 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set chunk size regex] ****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29
Friday 17 January 2025 04:51:28 -0500 (0:00:00.063) 0:11:20.538 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the chunk size] ****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37
Friday 17 January 2025 04:51:28 -0500 (0:00:00.058) 0:11:20.596 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID active devices count] *****************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46
Friday 17 January 2025 04:51:28 -0500 (0:00:00.075) 0:11:20.672 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID spare devices count] ******************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54
Friday 17 January 2025 04:51:28 -0500 (0:00:00.044) 0:11:20.717 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID metadata version] *********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62
Friday 17 January 2025 04:51:28 -0500 (0:00:00.046) 0:11:20.764 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID chunk size] ***************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70
Friday 17 January 2025 04:51:28 -0500 (0:00:00.058) 0:11:20.822 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the actual size of the volume] *************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3
Friday 17 January 2025 04:51:28 -0500 (0:00:00.055) 0:11:20.877 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
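Size parsing in these tests normalizes human-readable sizes into bytes before comparing. Ansible ships a human_to_bytes filter for exactly this; a sketch of the idea, with the fact name as an assumption:

    - name: Parse the requested size of the volume (illustrative)
      set_fact:
        storage_test_requested_size: "{{ '10 GiB' | human_to_bytes }}"   # -> 10737418240, the "size" seen above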
TASK [Parse the requested size of the volume] **********************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11
Friday 17 January 2025 04:51:28 -0500 (0:00:00.069) 0:11:20.947 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20
Friday 17 January 2025 04:51:28 -0500 (0:00:00.064) 0:11:21.011 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Show expected size] ******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28
Friday 17 January 2025 04:51:29 -0500 (0:00:00.057) 0:11:21.068 ********
ok: [managed-node1] => { "storage_test_expected_size": "4294967296" }

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32
Friday 17 January 2025 04:51:29 -0500 (0:00:00.072) 0:11:21.141 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Show test pool] **********************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46
Friday 17 January 2025 04:51:29 -0500 (0:00:00.073) 0:11:21.215 ********
skipping: [managed-node1] => {}

TASK [Show test blockinfo] *****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50
Friday 17 January 2025 04:51:29 -0500 (0:00:00.058) 0:11:21.273 ********
skipping: [managed-node1] => {}

TASK [Show test pool size] *****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54
Friday 17 January 2025 04:51:29 -0500 (0:00:00.058) 0:11:21.332 ********
skipping: [managed-node1] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58
Friday 17 January 2025 04:51:29 -0500 (0:00:00.057) 0:11:21.389 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Default thin pool reserved space values] *********************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67
Friday 17 January 2025 04:51:29 -0500 (0:00:00.053) 0:11:21.443 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Default minimal thin pool reserved space size] ***************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71
Friday 17 January 2025 04:51:29 -0500 (0:00:00.058) 0:11:21.502 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Default maximal thin pool reserved space size] ***************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76
Friday 17 January 2025 04:51:29 -0500 (0:00:00.061) 0:11:21.563 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Calculate maximum usable space in thin pool] *****************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82
Friday 17 January 2025 04:51:29 -0500 (0:00:00.060) 0:11:21.624 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Apply upper size limit to max usable thin pool space] ********************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86
Friday 17 January 2025 04:51:29 -0500 (0:00:00.055) 0:11:21.680 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Apply lower size limit to max usable thin pool space] ********************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91
Friday 17 January 2025 04:51:29 -0500 (0:00:00.056) 0:11:21.736 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Convert maximum usable thin pool space from int to Size] *****************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96
Friday 17 January 2025 04:51:29 -0500 (0:00:00.056) 0:11:21.792 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Show max thin pool size] *************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101
Friday 17 January 2025 04:51:29 -0500 (0:00:00.054) 0:11:21.847 ********
skipping: [managed-node1] => {}

TASK [Show volume thin pool size] **********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105
Friday 17 January 2025 04:51:29 -0500 (0:00:00.060) 0:11:21.908 ********
skipping: [managed-node1] => {}

TASK [Show test volume size] ***************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109
Friday 17 January 2025 04:51:29 -0500 (0:00:00.054) 0:11:21.963 ********
skipping: [managed-node1] => {}

TASK [Establish base value for expected thin pool size] ************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113
Friday 17 January 2025 04:51:29 -0500 (0:00:00.056) 0:11:22.019 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120
Friday 17 January 2025 04:51:30 -0500 (0:00:00.049) 0:11:22.069 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Establish base value for expected thin pool volume size] *****************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127
Friday 17 January 2025 04:51:30 -0500 (0:00:00.045) 0:11:22.114 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Calculate the expected thin pool volume size based on percentage value] ***
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131
Friday 17 January 2025 04:51:30 -0500 (0:00:00.049) 0:11:22.164 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Replace expected volume size with calculated value] **********************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137
Friday 17 January 2025 04:51:30 -0500 (0:00:00.060) 0:11:22.224 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Show actual size] ********************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143
Friday 17 January 2025 04:51:30 -0500 (0:00:00.038) 0:11:22.262 ********
ok: [managed-node1] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } }

TASK [Show expected size] ******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147
Friday 17 January 2025 04:51:30 -0500 (0:00:00.043) 0:11:22.305 ********
ok: [managed-node1] => { "storage_test_expected_size": "4294967296" }

TASK [Assert expected size is actual size] *************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151
Friday 17 January 2025 04:51:30 -0500 (0:00:00.042) 0:11:22.348 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get information about the LV] ********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5
Friday 17 January 2025 04:51:30 -0500 (0:00:00.037) 0:11:22.385 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set LV segment type] *****************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13
Friday 17 January 2025 04:51:30 -0500 (0:00:00.039) 0:11:22.425 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check segment type] ******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17
Friday 17 January 2025 04:51:30 -0500 (0:00:00.040) 0:11:22.465 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set LV cache size] *******************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24
Friday 17 January 2025 04:51:30 -0500 (0:00:00.041) 0:11:22.506 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }
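When a volume is present, the size branch ends by comparing the parsed actual size against the expected value shown above, presumably with something like the following, where the .bytes attribute is an assumption about how the parsed size is stored:

    - name: Assert expected size is actual size (illustrative)
      assert:
        that:
          - (storage_test_expected_size | int) == (storage_test_actual_size.bytes | int)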
TASK [Parse the requested cache size] ******************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31
Friday 17 January 2025 04:51:30 -0500 (0:00:00.047) 0:11:22.555 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set expected cache size] *************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37
Friday 17 January 2025 04:51:30 -0500 (0:00:00.057) 0:11:22.612 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42
Friday 17 January 2025 04:51:30 -0500 (0:00:00.055) 0:11:22.667 ********
skipping: [managed-node1] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25
Friday 17 January 2025 04:51:30 -0500 (0:00:00.057) 0:11:22.724 ********
ok: [managed-node1] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54
Friday 17 January 2025 04:51:30 -0500 (0:00:00.061) 0:11:22.786 ********
ok: [managed-node1] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false }

META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
managed-node1 : ok=1224 changed=60 unreachable=0 failed=9 skipped=1066 rescued=9 ignored=0
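Note the recap: failed=9 together with rescued=9 means every failure was caught by a rescue section, which is how these tests exercise expected-failure paths (for example, applying the wrong LUKS parameters) without aborting the play. The pattern, as a generic sketch rather than the tests' actual tasks:

    - name: Expected-failure pattern (generic sketch)
      block:
        - name: Provoke the failure under test
          command: /bin/false
      rescue:
        - name: Record that the failure occurred as expected
          set_fact:
            storage_test_failed_as_expected: true   # hypothetical fact name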
Friday 17 January 2025 04:51:30 -0500 (0:00:00.029) 0:11:22.815 ********
===============================================================================
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 64.95s
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 34.96s
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 11.01s
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 11.00s
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 10.86s
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 10.83s
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 10.28s
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 10.12s
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69
fedora.linux_system_roles.storage : Make sure blivet is available ------- 6.06s
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 4.68s
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 4.63s
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69
fedora.linux_system_roles.storage : Get required packages --------------- 4.48s
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 4.38s
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 4.38s
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 4.35s
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 4.32s
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 4.32s
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69
fedora.linux_system_roles.storage : Get required packages --------------- 4.25s
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
fedora.linux_system_roles.storage : Get required packages --------------- 4.23s
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
fedora.linux_system_roles.storage : Get required packages --------------- 4.23s
/tmp/collections-3Hp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
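The ranked duration list above comes from Ansible's profile_tasks callback. In Ansible 2.9 it is enabled from ansible.cfg; a minimal sketch of the relevant setting:

    [defaults]
    callback_whitelist = profile_tasks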