ansible-playbook [core 2.12.6]
  config file = /etc/ansible/ansible.cfg
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python3.9/site-packages/ansible
  ansible collection location = /tmp/tmpfdufgi2k
  executable location = /usr/bin/ansible-playbook
  python version = 3.9.13 (main, May 18 2022, 00:00:00) [GCC 11.3.1 20220421 (Red Hat 11.3.1-2)]
  jinja version = 2.11.3
  libyaml = True
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_remove_mount.yml ***********************************************
1 plays in /tmp/tmprua6lrek/tests/tests_remove_mount.yml

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmprua6lrek/tests/tests_remove_mount.yml:2
Wednesday 06 July 2022  11:01:07 +0000 (0:00:00.014)       0:00:00.014 ******** 
ok: [/cache/fedora-35.qcow2.snap]
META: ran handlers

TASK [include_role : linux-system-roles.storage] *******************************
task path: /tmp/tmprua6lrek/tests/tests_remove_mount.yml:12
Wednesday 06 July 2022  11:01:08 +0000 (0:00:01.341)       0:00:01.355 ******** 

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 06 July 2022  11:01:08 +0000 (0:00:00.034)       0:00:01.390 ******** 
included: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/fedora-35.qcow2.snap

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 06 July 2022  11:01:08 +0000 (0:00:00.031)       0:00:01.422 ******** 
ok: [/cache/fedora-35.qcow2.snap]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 06 July 2022  11:01:09 +0000 (0:00:00.539)       0:00:01.961 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => (item=RedHat.yml)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
ok: [/cache/fedora-35.qcow2.snap] => (item=Fedora.yml) => {
    "ansible_facts": {
        "_storage_copr_packages": [
            {
                "packages": [
                    "vdo",
                    "kmod-vdo"
                ],
                "repository": "rhawalsh/dm-vdo"
            }
        ],
        "_storage_copr_support_packages": [
            "dnf-plugins-core"
        ],
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/vars/Fedora.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "Fedora.yml"
}
skipping: [/cache/fedora-35.qcow2.snap] => (item=Fedora_35.yml)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "Fedora_35.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [/cache/fedora-35.qcow2.snap] => (item=Fedora_35.yml)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "Fedora_35.yml",
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 06 July 2022  11:01:09 +0000 (0:00:00.055)       0:00:02.016 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 06 July 2022  11:01:09 +0000 (0:00:00.033)       0:00:02.050 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 06 July 2022  11:01:09 +0000 (0:00:00.032)       0:00:02.083 ******** 
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
included: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/fedora-35.qcow2.snap

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 06 July 2022  11:01:09 +0000 (0:00:00.048)       0:00:02.131 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 06 July 2022  11:01:09 +0000 (0:00:00.018)       0:00:02.149 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 06 July 2022  11:01:12 +0000 (0:00:02.539)       0:00:04.689 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined"
}

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 06 July 2022  11:01:12 +0000 (0:00:00.036)       0:00:04.726 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined"
}

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 06 July 2022  11:01:12 +0000 (0:00:00.037)       0:00:04.764 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [],
    "pools": [],
    "volumes": []
}

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 06 July 2022  11:01:12 +0000 (0:00:00.772)       0:00:05.537 ******** 
included: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml for /cache/fedora-35.qcow2.snap

TASK [linux-system-roles.storage : check if the COPR support packages should be installed] ***
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:2
Wednesday 06 July 2022  11:01:12 +0000 (0:00:00.043)       0:00:05.580 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']})  => {
    "ansible_loop_var": "repo",
    "changed": false,
    "repo": {
        "packages": [
            "vdo",
            "kmod-vdo"
        ],
        "repository": "rhawalsh/dm-vdo"
    },
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : make sure COPR support packages are present] ***
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:13
Wednesday 06 July 2022  11:01:13 +0000 (0:00:00.043)       0:00:05.623 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : enable COPRs] *******************************
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/enable_coprs.yml:18
Wednesday 06 July 2022  11:01:13 +0000 (0:00:00.035)       0:00:05.658 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']})  => {
    "ansible_loop_var": "repo",
    "changed": false,
    "repo": {
        "packages": [
            "vdo",
            "kmod-vdo"
        ],
        "repository": "rhawalsh/dm-vdo"
    },
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 06 July 2022  11:01:13 +0000 (0:00:00.049)       0:00:05.708 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 06 July 2022  11:01:15 +0000 (0:00:02.069)       0:00:07.777 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "services": {
            "NetworkManager-dispatcher.service": {
                "name": "NetworkManager-dispatcher.service",
                "source": "systemd",
                "state": "inactive",
                "status": "enabled"
            },
            "NetworkManager-wait-online.service": {
                "name": "NetworkManager-wait-online.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "NetworkManager.service": {
                "name": "NetworkManager.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "arp-ethers.service": {
                "name": "arp-ethers.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "auditd.service": {
                "name": "auditd.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "autovt@.service": {
                "name": "autovt@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "alias"
            },
            "blivet.service": {
                "name": "blivet.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "blk-availability.service": {
                "name": "blk-availability.service",
                "source": "systemd",
                "state": "stopped",
                "status": "disabled"
            },
            "bluetooth.service": {
                "name": "bluetooth.service",
                "source": "systemd",
                "state": "inactive",
                "status": "enabled"
            },
            "chrony-wait.service": {
                "name": "chrony-wait.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "chronyd.service": {
                "name": "chronyd.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "cloud-config.service": {
                "name": "cloud-config.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "cloud-final.service": {
                "name": "cloud-final.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "cloud-init-local.service": {
                "name": "cloud-init-local.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "cloud-init.service": {
                "name": "cloud-init.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "console-getty.service": {
                "name": "console-getty.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "console-login-helper-messages-gensnippet-os-release.service": {
                "name": "console-login-helper-messages-gensnippet-os-release.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "console-login-helper-messages-gensnippet-ssh-keys.service": {
                "name": "console-login-helper-messages-gensnippet-ssh-keys.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "container-getty@.service": {
                "name": "container-getty@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "dbus-broker.service": {
                "name": "dbus-broker.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "dbus-org.bluez.service": {
                "name": "dbus-org.bluez.service",
                "source": "systemd",
                "state": "inactive",
                "status": "alias"
            },
            "dbus-org.freedesktop.hostname1.service": {
                "name": "dbus-org.freedesktop.hostname1.service",
                "source": "systemd",
                "state": "active",
                "status": "alias"
            },
            "dbus-org.freedesktop.locale1.service": {
                "name": "dbus-org.freedesktop.locale1.service",
                "source": "systemd",
                "state": "inactive",
                "status": "alias"
            },
            "dbus-org.freedesktop.login1.service": {
                "name": "dbus-org.freedesktop.login1.service",
                "source": "systemd",
                "state": "active",
                "status": "alias"
            },
            "dbus-org.freedesktop.nm-dispatcher.service": {
                "name": "dbus-org.freedesktop.nm-dispatcher.service",
                "source": "systemd",
                "state": "inactive",
                "status": "alias"
            },
            "dbus-org.freedesktop.oom1.service": {
                "name": "dbus-org.freedesktop.oom1.service",
                "source": "systemd",
                "state": "active",
                "status": "alias"
            },
            "dbus-org.freedesktop.portable1.service": {
                "name": "dbus-org.freedesktop.portable1.service",
                "source": "systemd",
                "state": "inactive",
                "status": "alias"
            },
            "dbus-org.freedesktop.resolve1.service": {
                "name": "dbus-org.freedesktop.resolve1.service",
                "source": "systemd",
                "state": "active",
                "status": "alias"
            },
            "dbus-org.freedesktop.timedate1.service": {
                "name": "dbus-org.freedesktop.timedate1.service",
                "source": "systemd",
                "state": "inactive",
                "status": "alias"
            },
            "dbus.service": {
                "name": "dbus.service",
                "source": "systemd",
                "state": "active",
                "status": "alias"
            },
            "debug-shell.service": {
                "name": "debug-shell.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "display-manager.service": {
                "name": "display-manager.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "dm-event.service": {
                "name": "dm-event.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dmraid-activation.service": {
                "name": "dmraid-activation.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "dnf-makecache.service": {
                "name": "dnf-makecache.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-cmdline.service": {
                "name": "dracut-cmdline.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-initqueue.service": {
                "name": "dracut-initqueue.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-mount.service": {
                "name": "dracut-mount.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-pre-mount.service": {
                "name": "dracut-pre-mount.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-pre-pivot.service": {
                "name": "dracut-pre-pivot.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-pre-trigger.service": {
                "name": "dracut-pre-trigger.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-pre-udev.service": {
                "name": "dracut-pre-udev.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-shutdown.service": {
                "name": "dracut-shutdown.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "emergency.service": {
                "name": "emergency.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "fcoe.service": {
                "name": "fcoe.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "fstrim.service": {
                "name": "fstrim.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "fwupd-offline-update.service": {
                "name": "fwupd-offline-update.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "fwupd-refresh.service": {
                "name": "fwupd-refresh.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "fwupd.service": {
                "name": "fwupd.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "getty@.service": {
                "name": "getty@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "enabled"
            },
            "getty@tty1.service": {
                "name": "getty@tty1.service",
                "source": "systemd",
                "state": "running",
                "status": "active"
            },
            "grub-boot-indeterminate.service": {
                "name": "grub-boot-indeterminate.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "grub2-systemd-integration.service": {
                "name": "grub2-systemd-integration.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "hv_kvp_daemon.service": {
                "name": "hv_kvp_daemon.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "initrd-cleanup.service": {
                "name": "initrd-cleanup.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "initrd-parse-etc.service": {
                "name": "initrd-parse-etc.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "initrd-switch-root.service": {
                "name": "initrd-switch-root.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "initrd-udevadm-cleanup-db.service": {
                "name": "initrd-udevadm-cleanup-db.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "iscsi-shutdown.service": {
                "name": "iscsi-shutdown.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "iscsi.service": {
                "name": "iscsi.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "iscsid.service": {
                "name": "iscsid.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "kmod-static-nodes.service": {
                "name": "kmod-static-nodes.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "ldconfig.service": {
                "name": "ldconfig.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "lvm2-activation-early.service": {
                "name": "lvm2-activation-early.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "lvm2-activation.service": {
                "name": "lvm2-activation.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "lvm2-lvmpolld.service": {
                "name": "lvm2-lvmpolld.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "lvm2-monitor.service": {
                "name": "lvm2-monitor.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "lvm2-pvscan@.service": {
                "name": "lvm2-pvscan@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "man-db-cache-update.service": {
                "name": "man-db-cache-update.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "man-db-restart-cache-update.service": {
                "name": "man-db-restart-cache-update.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "mdadm-grow-continue@.service": {
                "name": "mdadm-grow-continue@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "mdadm-last-resort@.service": {
                "name": "mdadm-last-resort@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "mdcheck_continue.service": {
                "name": "mdcheck_continue.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "mdcheck_start.service": {
                "name": "mdcheck_start.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "mdmon@.service": {
                "name": "mdmon@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "mdmonitor-oneshot.service": {
                "name": "mdmonitor-oneshot.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "mdmonitor.service": {
                "name": "mdmonitor.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "modprobe@.service": {
                "name": "modprobe@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "modprobe@configfs.service": {
                "name": "modprobe@configfs.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "modprobe@drm.service": {
                "name": "modprobe@drm.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "modprobe@fuse.service": {
                "name": "modprobe@fuse.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "multipathd.service": {
                "name": "multipathd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "ndctl-monitor.service": {
                "name": "ndctl-monitor.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "network.service": {
                "name": "network.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "nis-domainname.service": {
                "name": "nis-domainname.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "ntpd.service": {
                "name": "ntpd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "ntpdate.service": {
                "name": "ntpdate.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "pam_namespace.service": {
                "name": "pam_namespace.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "pcscd.service": {
                "name": "pcscd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "indirect"
            },
            "plymouth-quit-wait.service": {
                "name": "plymouth-quit-wait.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "plymouth-start.service": {
                "name": "plymouth-start.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "polkit.service": {
                "name": "polkit.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "qemu-guest-agent.service": {
                "name": "qemu-guest-agent.service",
                "source": "systemd",
                "state": "inactive",
                "status": "enabled"
            },
            "quotaon.service": {
                "name": "quotaon.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "raid-check.service": {
                "name": "raid-check.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "rbdmap.service": {
                "name": "rbdmap.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "rc-local.service": {
                "name": "rc-local.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "rdisc.service": {
                "name": "rdisc.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "rescue.service": {
                "name": "rescue.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "rpmdb-rebuild.service": {
                "name": "rpmdb-rebuild.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "selinux-autorelabel-mark.service": {
                "name": "selinux-autorelabel-mark.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "selinux-autorelabel.service": {
                "name": "selinux-autorelabel.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "selinux-check-proper-disable.service": {
                "name": "selinux-check-proper-disable.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "serial-getty@.service": {
                "name": "serial-getty@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "indirect"
            },
            "serial-getty@ttyS0.service": {
                "name": "serial-getty@ttyS0.service",
                "source": "systemd",
                "state": "running",
                "status": "active"
            },
            "snapd.seeded.service": {
                "name": "snapd.seeded.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "sntp.service": {
                "name": "sntp.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "sshd-keygen.service": {
                "name": "sshd-keygen.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "sshd-keygen@.service": {
                "name": "sshd-keygen@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "disabled"
            },
            "sshd-keygen@ecdsa.service": {
                "name": "sshd-keygen@ecdsa.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "sshd-keygen@ed25519.service": {
                "name": "sshd-keygen@ed25519.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "sshd-keygen@rsa.service": {
                "name": "sshd-keygen@rsa.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "sshd.service": {
                "name": "sshd.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "sshd@.service": {
                "name": "sshd@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "sssd-autofs.service": {
                "name": "sssd-autofs.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "sssd-kcm.service": {
                "name": "sssd-kcm.service",
                "source": "systemd",
                "state": "stopped",
                "status": "indirect"
            },
            "sssd-nss.service": {
                "name": "sssd-nss.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "sssd-pac.service": {
                "name": "sssd-pac.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "sssd-pam.service": {
                "name": "sssd-pam.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "sssd-ssh.service": {
                "name": "sssd-ssh.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "sssd-sudo.service": {
                "name": "sssd-sudo.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "sssd.service": {
                "name": "sssd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "syslog.service": {
                "name": "syslog.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "system-update-cleanup.service": {
                "name": "system-update-cleanup.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-ask-password-console.service": {
                "name": "systemd-ask-password-console.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-ask-password-wall.service": {
                "name": "systemd-ask-password-wall.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-backlight@.service": {
                "name": "systemd-backlight@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-binfmt.service": {
                "name": "systemd-binfmt.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-bless-boot.service": {
                "name": "systemd-bless-boot.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-boot-check-no-failures.service": {
                "name": "systemd-boot-check-no-failures.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-boot-system-token.service": {
                "name": "systemd-boot-system-token.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-coredump@.service": {
                "name": "systemd-coredump@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-exit.service": {
                "name": "systemd-exit.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-firstboot.service": {
                "name": "systemd-firstboot.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-fsck-root.service": {
                "name": "systemd-fsck-root.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-fsck@.service": {
                "name": "systemd-fsck@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-fsck@dev-disk-by\\x2duuid-5B84\\x2d6DD7.service": {
                "name": "systemd-fsck@dev-disk-by\\x2duuid-5B84\\x2d6DD7.service",
                "source": "systemd",
                "state": "stopped",
                "status": "active"
            },
            "systemd-fsck@dev-disk-by\\x2duuid-5f2f82d0\\x2dae0a\\x2d4574\\x2d8811\\x2d62a31a51a870.service": {
                "name": "systemd-fsck@dev-disk-by\\x2duuid-5f2f82d0\\x2dae0a\\x2d4574\\x2d8811\\x2d62a31a51a870.service",
                "source": "systemd",
                "state": "stopped",
                "status": "active"
            },
            "systemd-fsck@dev-vdb1.service": {
                "name": "systemd-fsck@dev-vdb1.service",
                "source": "systemd",
                "state": "stopped",
                "status": "active"
            },
            "systemd-fsck@dev-vdc1.service": {
                "name": "systemd-fsck@dev-vdc1.service",
                "source": "systemd",
                "state": "stopped",
                "status": "active"
            },
            "systemd-halt.service": {
                "name": "systemd-halt.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-hibernate-resume@.service": {
                "name": "systemd-hibernate-resume@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-hibernate.service": {
                "name": "systemd-hibernate.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-homed-activate.service": {
                "name": "systemd-homed-activate.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-homed.service": {
                "name": "systemd-homed.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-hostnamed.service": {
                "name": "systemd-hostnamed.service",
                "source": "systemd",
                "state": "running",
                "status": "static"
            },
            "systemd-hwdb-update.service": {
                "name": "systemd-hwdb-update.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-hybrid-sleep.service": {
                "name": "systemd-hybrid-sleep.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-initctl.service": {
                "name": "systemd-initctl.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-journal-catalog-update.service": {
                "name": "systemd-journal-catalog-update.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-journal-flush.service": {
                "name": "systemd-journal-flush.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-journald.service": {
                "name": "systemd-journald.service",
                "source": "systemd",
                "state": "running",
                "status": "static"
            },
            "systemd-journald@.service": {
                "name": "systemd-journald@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-kexec.service": {
                "name": "systemd-kexec.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-localed.service": {
                "name": "systemd-localed.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-logind.service": {
                "name": "systemd-logind.service",
                "source": "systemd",
                "state": "running",
                "status": "static"
            },
            "systemd-machine-id-commit.service": {
                "name": "systemd-machine-id-commit.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-modules-load.service": {
                "name": "systemd-modules-load.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-network-generator.service": {
                "name": "systemd-network-generator.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "systemd-networkd-wait-online.service": {
                "name": "systemd-networkd-wait-online.service",
                "source": "systemd",
                "state": "stopped",
                "status": "disabled"
            },
            "systemd-networkd.service": {
                "name": "systemd-networkd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "disabled"
            },
            "systemd-oomd.service": {
                "name": "systemd-oomd.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "systemd-portabled.service": {
                "name": "systemd-portabled.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-poweroff.service": {
                "name": "systemd-poweroff.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-pstore.service": {
                "name": "systemd-pstore.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-quotacheck.service": {
                "name": "systemd-quotacheck.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-random-seed.service": {
                "name": "systemd-random-seed.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-reboot.service": {
                "name": "systemd-reboot.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-remount-fs.service": {
                "name": "systemd-remount-fs.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled-runtime"
            },
            "systemd-repart.service": {
                "name": "systemd-repart.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-resolved.service": {
                "name": "systemd-resolved.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "systemd-rfkill.service": {
                "name": "systemd-rfkill.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-suspend-then-hibernate.service": {
                "name": "systemd-suspend-then-hibernate.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-suspend.service": {
                "name": "systemd-suspend.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-sysctl.service": {
                "name": "systemd-sysctl.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-sysext.service": {
                "name": "systemd-sysext.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-sysusers.service": {
                "name": "systemd-sysusers.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-time-wait-sync.service": {
                "name": "systemd-time-wait-sync.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-timedated.service": {
                "name": "systemd-timedated.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-timesyncd.service": {
                "name": "systemd-timesyncd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "disabled"
            },
            "systemd-tmpfiles-clean.service": {
                "name": "systemd-tmpfiles-clean.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-tmpfiles-setup-dev.service": {
                "name": "systemd-tmpfiles-setup-dev.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-tmpfiles-setup.service": {
                "name": "systemd-tmpfiles-setup.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-udev-settle.service": {
                "name": "systemd-udev-settle.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-udev-trigger.service": {
                "name": "systemd-udev-trigger.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-udevd.service": {
                "name": "systemd-udevd.service",
                "source": "systemd",
                "state": "running",
                "status": "static"
            },
            "systemd-update-done.service": {
                "name": "systemd-update-done.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-update-utmp-runlevel.service": {
                "name": "systemd-update-utmp-runlevel.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-update-utmp.service": {
                "name": "systemd-update-utmp.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-user-sessions.service": {
                "name": "systemd-user-sessions.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-userdbd.service": {
                "name": "systemd-userdbd.service",
                "source": "systemd",
                "state": "running",
                "status": "indirect"
            },
            "systemd-vconsole-setup.service": {
                "name": "systemd-vconsole-setup.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-volatile-root.service": {
                "name": "systemd-volatile-root.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-zram-setup@.service": {
                "name": "systemd-zram-setup@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-zram-setup@zram0.service": {
                "name": "systemd-zram-setup@zram0.service",
                "source": "systemd",
                "state": "stopped",
                "status": "active"
            },
            "udisks2.service": {
                "name": "udisks2.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "unbound-anchor.service": {
                "name": "unbound-anchor.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "user-runtime-dir@.service": {
                "name": "user-runtime-dir@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "user-runtime-dir@0.service": {
                "name": "user-runtime-dir@0.service",
                "source": "systemd",
                "state": "stopped",
                "status": "active"
            },
            "user@.service": {
                "name": "user@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "user@0.service": {
                "name": "user@0.service",
                "source": "systemd",
                "state": "running",
                "status": "active"
            }
        }
    },
    "changed": false
}

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 06 July 2022  11:01:17 +0000 (0:00:01.988)       0:00:09.766 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_cryptsetup_services": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 06 July 2022  11:01:17 +0000 (0:00:00.058)       0:00:09.824 ******** 

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 06 July 2022  11:01:17 +0000 (0:00:00.023)       0:00:09.847 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [],
    "pools": [],
    "volumes": []
}

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 06 July 2022  11:01:17 +0000 (0:00:00.576)       0:00:10.424 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 06 July 2022  11:01:17 +0000 (0:00:00.037)       0:00:10.462 ******** 

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 06 July 2022  11:01:17 +0000 (0:00:00.023)       0:00:10.486 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "blivet_output": {
        "actions": [],
        "changed": false,
        "crypts": [],
        "failed": false,
        "leaves": [],
        "mounts": [],
        "packages": [],
        "pools": [],
        "volumes": []
    }
}

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 06 July 2022  11:01:17 +0000 (0:00:00.036)       0:00:10.522 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 06 July 2022  11:01:17 +0000 (0:00:00.036)       0:00:10.558 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 06 July 2022  11:01:17 +0000 (0:00:00.037)       0:00:10.596 ******** 

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 06 July 2022  11:01:18 +0000 (0:00:00.038)       0:00:10.634 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 06 July 2022  11:01:18 +0000 (0:00:00.023)       0:00:10.658 ******** 

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 06 July 2022  11:01:18 +0000 (0:00:00.035)       0:00:10.694 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 06 July 2022  11:01:18 +0000 (0:00:00.026)       0:00:10.721 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "stat": {
        "atime": 1657093385.4860332,
        "attr_flags": "",
        "attributes": [],
        "block_size": 4096,
        "blocks": 0,
        "charset": "binary",
        "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709",
        "ctime": 1657005647.423,
        "dev": 31,
        "device_type": 0,
        "executable": false,
        "exists": true,
        "gid": 0,
        "gr_name": "root",
        "inode": 267,
        "isblk": false,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": true,
        "issock": false,
        "isuid": false,
        "mimetype": "inode/x-empty",
        "mode": "0600",
        "mtime": 1657005500.596,
        "nlink": 1,
        "path": "/etc/crypttab",
        "pw_name": "root",
        "readable": true,
        "rgrp": false,
        "roth": false,
        "rusr": true,
        "size": 0,
        "uid": 0,
        "version": "10",
        "wgrp": false,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": false,
        "xoth": false,
        "xusr": false
    }
}

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 06 July 2022  11:01:18 +0000 (0:00:00.523)       0:00:11.244 ******** 

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 06 July 2022  11:01:18 +0000 (0:00:00.021)       0:00:11.266 ******** 
ok: [/cache/fedora-35.qcow2.snap]
META: role_complete for /cache/fedora-35.qcow2.snap

TASK [Mark tasks to be skipped] ************************************************
task path: /tmp/tmprua6lrek/tests/tests_remove_mount.yml:15
Wednesday 06 July 2022  11:01:19 +0000 (0:00:00.947)       0:00:12.213 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_skip_checks": [
            "blivet_available",
            "packages_installed",
            "service_facts"
        ]
    },
    "changed": false
}
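
The set_fact above can be reconstructed almost verbatim from its output; a sketch of what the task in tests_remove_mount.yml presumably looks like (only the fact name and values are taken from the log, the exact task layout is assumed):

    - name: Mark tasks to be skipped
      set_fact:
        storage_skip_checks:
          - blivet_available
          - packages_installed
          - service_facts

With these checks recorded, the role's "make sure blivet is available", "make sure required packages are installed" and "get service facts" tasks show up as skipped on the second role invocation further down in this log.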

TASK [include_tasks] ***********************************************************
task path: /tmp/tmprua6lrek/tests/tests_remove_mount.yml:22
Wednesday 06 July 2022  11:01:19 +0000 (0:00:00.034)       0:00:12.248 ******** 
included: /tmp/tmprua6lrek/tests/get_unused_disk.yml for /cache/fedora-35.qcow2.snap

TASK [Find unused disks in the system] *****************************************
task path: /tmp/tmprua6lrek/tests/get_unused_disk.yml:2
Wednesday 06 July 2022  11:01:19 +0000 (0:00:00.038)       0:00:12.286 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "disks": [
        "sda"
    ]
}

TASK [Set unused_disks if necessary] *******************************************
task path: /tmp/tmprua6lrek/tests/get_unused_disk.yml:9
Wednesday 06 July 2022  11:01:20 +0000 (0:00:00.544)       0:00:12.830 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "unused_disks": [
            "sda"
        ]
    },
    "changed": false
}

TASK [Exit playbook when there's not enough unused disks in the system] ********
task path: /tmp/tmprua6lrek/tests/get_unused_disk.yml:14
Wednesday 06 July 2022  11:01:20 +0000 (0:00:00.038)       0:00:12.868 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Print unused disks] ******************************************************
task path: /tmp/tmprua6lrek/tests/get_unused_disk.yml:19
Wednesday 06 July 2022  11:01:20 +0000 (0:00:00.039)       0:00:12.908 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "unused_disks": [
        "sda"
    ]
}

TASK [Create a LVM logical volume mounted at "/opt/test1"] *********************
task path: /tmp/tmprua6lrek/tests/tests_remove_mount.yml:27
Wednesday 06 July 2022  11:01:20 +0000 (0:00:00.068)       0:00:12.977 ******** 

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 06 July 2022  11:01:20 +0000 (0:00:00.042)       0:00:13.020 ******** 
included: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/fedora-35.qcow2.snap

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 06 July 2022  11:01:20 +0000 (0:00:00.062)       0:00:13.082 ******** 
ok: [/cache/fedora-35.qcow2.snap]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 06 July 2022  11:01:20 +0000 (0:00:00.533)       0:00:13.615 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => (item=RedHat.yml)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
ok: [/cache/fedora-35.qcow2.snap] => (item=Fedora.yml) => {
    "ansible_facts": {
        "_storage_copr_packages": [
            {
                "packages": [
                    "vdo",
                    "kmod-vdo"
                ],
                "repository": "rhawalsh/dm-vdo"
            }
        ],
        "_storage_copr_support_packages": [
            "dnf-plugins-core"
        ],
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/vars/Fedora.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "Fedora.yml"
}
skipping: [/cache/fedora-35.qcow2.snap] => (item=Fedora_35.yml)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "Fedora_35.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [/cache/fedora-35.qcow2.snap] => (item=Fedora_35.yml)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "Fedora_35.yml",
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 06 July 2022  11:01:21 +0000 (0:00:00.062)       0:00:13.677 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 06 July 2022  11:01:21 +0000 (0:00:00.032)       0:00:13.710 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 06 July 2022  11:01:21 +0000 (0:00:00.035)       0:00:13.746 ******** 
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
included: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/fedora-35.qcow2.snap

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 06 July 2022  11:01:21 +0000 (0:00:00.049)       0:00:13.796 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 06 July 2022  11:01:21 +0000 (0:00:00.022)       0:00:13.818 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 06 July 2022  11:01:21 +0000 (0:00:00.029)       0:00:13.848 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "storage_pools": [
        {
            "disks": [
                "sda"
            ],
            "name": "foo",
            "volumes": [
                {
                    "mount_point": "/opt/test1",
                    "name": "test1",
                    "size": "3g"
                }
            ]
        }
    ]
}
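
The storage_pools value above, together with the task name at tests_remove_mount.yml:27, implies a role invocation roughly like the following. This is a sketch reconstructed from the log, not the test file itself; feeding the disks list from unused_disks is an assumption:

    - name: Create a LVM logical volume mounted at "/opt/test1"
      include_role:
        name: linux-system-roles.storage
      vars:
        storage_pools:
          - name: foo
            disks: "{{ unused_disks }}"   # renders to ['sda'] here; assumed to come from unused_disks
            volumes:
              - name: test1
                size: "3g"
                mount_point: "/opt/test1"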

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 06 July 2022  11:01:21 +0000 (0:00:00.037)       0:00:13.886 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined"
}

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 06 July 2022  11:01:21 +0000 (0:00:00.037)       0:00:13.923 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 06 July 2022  11:01:21 +0000 (0:00:00.029)       0:00:13.953 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 06 July 2022  11:01:21 +0000 (0:00:00.031)       0:00:13.984 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 06 July 2022  11:01:21 +0000 (0:00:00.029)       0:00:14.014 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 06 July 2022  11:01:21 +0000 (0:00:00.028)       0:00:14.042 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_cryptsetup_services": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 06 July 2022  11:01:21 +0000 (0:00:00.057)       0:00:14.099 ******** 

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 06 July 2022  11:01:21 +0000 (0:00:00.023)       0:00:14.123 ******** 
changed: [/cache/fedora-35.qcow2.snap] => {
    "actions": [
        {
            "action": "create format",
            "device": "/dev/sda",
            "fs_type": "lvmpv"
        },
        {
            "action": "create device",
            "device": "/dev/foo",
            "fs_type": null
        },
        {
            "action": "create device",
            "device": "/dev/mapper/foo-test1",
            "fs_type": null
        },
        {
            "action": "create format",
            "device": "/dev/mapper/foo-test1",
            "fs_type": "xfs"
        }
    ],
    "changed": true,
    "crypts": [],
    "leaves": [
        "/dev/sr0",
        "/dev/vda1",
        "/dev/vda2",
        "/dev/vda3",
        "/dev/vda4",
        "/dev/vda5",
        "/dev/sdb",
        "/dev/sdc",
        "/dev/nvme0n1",
        "/dev/nvme1n1",
        "/dev/nvme2n1",
        "/dev/vdb1",
        "/dev/vdc1",
        "/dev/vdd",
        "/dev/vde",
        "/dev/vdf",
        "/dev/zram0",
        "/dev/mapper/foo-test1"
    ],
    "mounts": [
        {
            "dump": 0,
            "fstype": "xfs",
            "opts": "defaults",
            "passno": 0,
            "path": "/opt/test1",
            "src": "/dev/mapper/foo-test1",
            "state": "mounted"
        }
    ],
    "packages": [
        "lvm2",
        "xfsprogs",
        "dosfstools",
        "e2fsprogs",
        "btrfs-progs"
    ],
    "pools": [
        {
            "disks": [
                "sda"
            ],
            "encryption": false,
            "encryption_cipher": null,
            "encryption_key": null,
            "encryption_key_size": null,
            "encryption_luks_version": null,
            "encryption_password": null,
            "name": "foo",
            "raid_chunk_size": null,
            "raid_device_count": null,
            "raid_level": null,
            "raid_metadata_version": null,
            "raid_spare_count": null,
            "state": "present",
            "type": "lvm",
            "volumes": [
                {
                    "_device": "/dev/mapper/foo-test1",
                    "_kernel_device": "/dev/dm-0",
                    "_mount_id": "/dev/mapper/foo-test1",
                    "_raw_device": "/dev/mapper/foo-test1",
                    "_raw_kernel_device": "/dev/dm-0",
                    "cache_devices": [],
                    "cache_mode": null,
                    "cache_size": 0,
                    "cached": false,
                    "compression": null,
                    "deduplication": null,
                    "disks": [],
                    "encryption": false,
                    "encryption_cipher": null,
                    "encryption_key": null,
                    "encryption_key_size": null,
                    "encryption_luks_version": null,
                    "encryption_password": null,
                    "fs_create_options": "",
                    "fs_label": "",
                    "fs_overwrite_existing": true,
                    "fs_type": "xfs",
                    "mount_check": 0,
                    "mount_device_identifier": "uuid",
                    "mount_options": "defaults",
                    "mount_passno": 0,
                    "mount_point": "/opt/test1",
                    "name": "test1",
                    "raid_chunk_size": null,
                    "raid_device_count": null,
                    "raid_disks": [],
                    "raid_level": null,
                    "raid_metadata_version": null,
                    "raid_spare_count": null,
                    "size": "3g",
                    "state": "present",
                    "thin": false,
                    "thin_pool_name": null,
                    "thin_pool_size": null,
                    "type": "lvm",
                    "vdo_pool_size": null
                }
            ]
        }
    ],
    "volumes": []
}
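
The "manage the pools and volumes to match the specified state" task is the role's call into its bundled blivet module. A minimal sketch of that call, keeping only what the output above implies; the internal variable names and any extra module parameters the role really passes are assumptions:

    - name: manage the pools and volumes to match the specified state
      blivet:
        pools: "{{ _storage_pools }}"      # variable name assumed
        volumes: "{{ _storage_volumes }}"  # variable name assumed
      register: blivet_output

The registered result is what the "show blivet_output" task prints just below, and its mounts list drives the fstab and mount handling that follows.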

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 06 July 2022  11:01:24 +0000 (0:00:02.596)       0:00:16.719 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 06 July 2022  11:01:24 +0000 (0:00:00.037)       0:00:16.756 ******** 

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 06 July 2022  11:01:24 +0000 (0:00:00.021)       0:00:16.777 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "blivet_output": {
        "actions": [
            {
                "action": "create format",
                "device": "/dev/sda",
                "fs_type": "lvmpv"
            },
            {
                "action": "create device",
                "device": "/dev/foo",
                "fs_type": null
            },
            {
                "action": "create device",
                "device": "/dev/mapper/foo-test1",
                "fs_type": null
            },
            {
                "action": "create format",
                "device": "/dev/mapper/foo-test1",
                "fs_type": "xfs"
            }
        ],
        "changed": true,
        "crypts": [],
        "failed": false,
        "leaves": [
            "/dev/sr0",
            "/dev/vda1",
            "/dev/vda2",
            "/dev/vda3",
            "/dev/vda4",
            "/dev/vda5",
            "/dev/sdb",
            "/dev/sdc",
            "/dev/nvme0n1",
            "/dev/nvme1n1",
            "/dev/nvme2n1",
            "/dev/vdb1",
            "/dev/vdc1",
            "/dev/vdd",
            "/dev/vde",
            "/dev/vdf",
            "/dev/zram0",
            "/dev/mapper/foo-test1"
        ],
        "mounts": [
            {
                "dump": 0,
                "fstype": "xfs",
                "opts": "defaults",
                "passno": 0,
                "path": "/opt/test1",
                "src": "/dev/mapper/foo-test1",
                "state": "mounted"
            }
        ],
        "packages": [
            "lvm2",
            "xfsprogs",
            "dosfstools",
            "e2fsprogs",
            "btrfs-progs"
        ],
        "pools": [
            {
                "disks": [
                    "sda"
                ],
                "encryption": false,
                "encryption_cipher": null,
                "encryption_key": null,
                "encryption_key_size": null,
                "encryption_luks_version": null,
                "encryption_password": null,
                "name": "foo",
                "raid_chunk_size": null,
                "raid_device_count": null,
                "raid_level": null,
                "raid_metadata_version": null,
                "raid_spare_count": null,
                "state": "present",
                "type": "lvm",
                "volumes": [
                    {
                        "_device": "/dev/mapper/foo-test1",
                        "_kernel_device": "/dev/dm-0",
                        "_mount_id": "/dev/mapper/foo-test1",
                        "_raw_device": "/dev/mapper/foo-test1",
                        "_raw_kernel_device": "/dev/dm-0",
                        "cache_devices": [],
                        "cache_mode": null,
                        "cache_size": 0,
                        "cached": false,
                        "compression": null,
                        "deduplication": null,
                        "disks": [],
                        "encryption": false,
                        "encryption_cipher": null,
                        "encryption_key": null,
                        "encryption_key_size": null,
                        "encryption_luks_version": null,
                        "encryption_password": null,
                        "fs_create_options": "",
                        "fs_label": "",
                        "fs_overwrite_existing": true,
                        "fs_type": "xfs",
                        "mount_check": 0,
                        "mount_device_identifier": "uuid",
                        "mount_options": "defaults",
                        "mount_passno": 0,
                        "mount_point": "/opt/test1",
                        "name": "test1",
                        "raid_chunk_size": null,
                        "raid_device_count": null,
                        "raid_disks": [],
                        "raid_level": null,
                        "raid_metadata_version": null,
                        "raid_spare_count": null,
                        "size": "3g",
                        "state": "present",
                        "thin": false,
                        "thin_pool_name": null,
                        "thin_pool_size": null,
                        "type": "lvm",
                        "vdo_pool_size": null
                    }
                ]
            }
        ],
        "volumes": []
    }
}

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 06 July 2022  11:01:24 +0000 (0:00:00.037)       0:00:16.815 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_pools_list": [
            {
                "disks": [
                    "sda"
                ],
                "encryption": false,
                "encryption_cipher": null,
                "encryption_key": null,
                "encryption_key_size": null,
                "encryption_luks_version": null,
                "encryption_password": null,
                "name": "foo",
                "raid_chunk_size": null,
                "raid_device_count": null,
                "raid_level": null,
                "raid_metadata_version": null,
                "raid_spare_count": null,
                "state": "present",
                "type": "lvm",
                "volumes": [
                    {
                        "_device": "/dev/mapper/foo-test1",
                        "_kernel_device": "/dev/dm-0",
                        "_mount_id": "/dev/mapper/foo-test1",
                        "_raw_device": "/dev/mapper/foo-test1",
                        "_raw_kernel_device": "/dev/dm-0",
                        "cache_devices": [],
                        "cache_mode": null,
                        "cache_size": 0,
                        "cached": false,
                        "compression": null,
                        "deduplication": null,
                        "disks": [],
                        "encryption": false,
                        "encryption_cipher": null,
                        "encryption_key": null,
                        "encryption_key_size": null,
                        "encryption_luks_version": null,
                        "encryption_password": null,
                        "fs_create_options": "",
                        "fs_label": "",
                        "fs_overwrite_existing": true,
                        "fs_type": "xfs",
                        "mount_check": 0,
                        "mount_device_identifier": "uuid",
                        "mount_options": "defaults",
                        "mount_passno": 0,
                        "mount_point": "/opt/test1",
                        "name": "test1",
                        "raid_chunk_size": null,
                        "raid_device_count": null,
                        "raid_disks": [],
                        "raid_level": null,
                        "raid_metadata_version": null,
                        "raid_spare_count": null,
                        "size": "3g",
                        "state": "present",
                        "thin": false,
                        "thin_pool_name": null,
                        "thin_pool_size": null,
                        "type": "lvm",
                        "vdo_pool_size": null
                    }
                ]
            }
        ]
    },
    "changed": false
}

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 06 July 2022  11:01:24 +0000 (0:00:00.066)       0:00:16.881 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 06 July 2022  11:01:24 +0000 (0:00:00.101)       0:00:16.983 ******** 

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 06 July 2022  11:01:24 +0000 (0:00:00.039)       0:00:17.022 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "name": null,
    "status": {}
}
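
This daemon-reload step (and the identical one after the mounts are set up) is consistent with a plain systemd daemon_reload call with no unit name, which matches the "name": null in the output; a sketch under that assumption:

    - name: tell systemd to refresh its view of /etc/fstab
      systemd:
        daemon_reload: true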

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 06 July 2022  11:01:25 +0000 (0:00:00.998)       0:00:18.020 ******** 
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
changed: [/cache/fedora-35.qcow2.snap] => (item={'src': '/dev/mapper/foo-test1', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted'}) => {
    "ansible_loop_var": "mount_info",
    "backup_file": "",
    "boot": "yes",
    "changed": true,
    "dump": "0",
    "fstab": "/etc/fstab",
    "fstype": "xfs",
    "mount_info": {
        "dump": 0,
        "fstype": "xfs",
        "opts": "defaults",
        "passno": 0,
        "path": "/opt/test1",
        "src": "/dev/mapper/foo-test1",
        "state": "mounted"
    },
    "name": "/opt/test1",
    "opts": "defaults",
    "passno": "0",
    "src": "/dev/mapper/foo-test1"
}
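
The item keys and the mount_info loop variable above point at a loop over blivet_output.mounts feeding ansible.posix.mount; a sketch of such a task (the role's actual task may pass more fields from the item than shown here):

    - name: set up new/current mounts
      ansible.posix.mount:
        src: "{{ mount_info['src'] }}"
        path: "{{ mount_info['path'] }}"
        fstype: "{{ mount_info['fstype'] }}"
        opts: "{{ mount_info['opts'] }}"
        state: "{{ mount_info['state'] }}"
      loop: "{{ blivet_output.mounts }}"
      loop_control:
        loop_var: mount_info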

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 06 July 2022  11:01:26 +0000 (0:00:00.740)       0:00:18.760 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "name": null,
    "status": {}
}

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 06 July 2022  11:01:26 +0000 (0:00:00.741)       0:00:19.501 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "stat": {
        "atime": 1657093385.4860332,
        "attr_flags": "",
        "attributes": [],
        "block_size": 4096,
        "blocks": 0,
        "charset": "binary",
        "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709",
        "ctime": 1657005647.423,
        "dev": 31,
        "device_type": 0,
        "executable": false,
        "exists": true,
        "gid": 0,
        "gr_name": "root",
        "inode": 267,
        "isblk": false,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": true,
        "issock": false,
        "isuid": false,
        "mimetype": "inode/x-empty",
        "mode": "0600",
        "mtime": 1657005500.596,
        "nlink": 1,
        "path": "/etc/crypttab",
        "pw_name": "root",
        "readable": true,
        "rgrp": false,
        "roth": false,
        "rusr": true,
        "size": 0,
        "uid": 0,
        "version": "10",
        "wgrp": false,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": false,
        "xoth": false,
        "xusr": false
    }
}

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 06 July 2022  11:01:27 +0000 (0:00:00.404)       0:00:19.906 ******** 

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 06 July 2022  11:01:27 +0000 (0:00:00.022)       0:00:19.928 ******** 
ok: [/cache/fedora-35.qcow2.snap]
META: role_complete for /cache/fedora-35.qcow2.snap

TASK [include_tasks] ***********************************************************
task path: /tmp/tmprua6lrek/tests/tests_remove_mount.yml:39
Wednesday 06 July 2022  11:01:28 +0000 (0:00:00.950)       0:00:20.878 ******** 
included: /tmp/tmprua6lrek/tests/verify-role-results.yml for /cache/fedora-35.qcow2.snap

TASK [Print out pool information] **********************************************
task path: /tmp/tmprua6lrek/tests/verify-role-results.yml:1
Wednesday 06 July 2022  11:01:28 +0000 (0:00:00.042)       0:00:20.920 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "_storage_pools_list": [
        {
            "disks": [
                "sda"
            ],
            "encryption": false,
            "encryption_cipher": null,
            "encryption_key": null,
            "encryption_key_size": null,
            "encryption_luks_version": null,
            "encryption_password": null,
            "name": "foo",
            "raid_chunk_size": null,
            "raid_device_count": null,
            "raid_level": null,
            "raid_metadata_version": null,
            "raid_spare_count": null,
            "state": "present",
            "type": "lvm",
            "volumes": [
                {
                    "_device": "/dev/mapper/foo-test1",
                    "_kernel_device": "/dev/dm-0",
                    "_mount_id": "/dev/mapper/foo-test1",
                    "_raw_device": "/dev/mapper/foo-test1",
                    "_raw_kernel_device": "/dev/dm-0",
                    "cache_devices": [],
                    "cache_mode": null,
                    "cache_size": 0,
                    "cached": false,
                    "compression": null,
                    "deduplication": null,
                    "disks": [],
                    "encryption": false,
                    "encryption_cipher": null,
                    "encryption_key": null,
                    "encryption_key_size": null,
                    "encryption_luks_version": null,
                    "encryption_password": null,
                    "fs_create_options": "",
                    "fs_label": "",
                    "fs_overwrite_existing": true,
                    "fs_type": "xfs",
                    "mount_check": 0,
                    "mount_device_identifier": "uuid",
                    "mount_options": "defaults",
                    "mount_passno": 0,
                    "mount_point": "/opt/test1",
                    "name": "test1",
                    "raid_chunk_size": null,
                    "raid_device_count": null,
                    "raid_disks": [],
                    "raid_level": null,
                    "raid_metadata_version": null,
                    "raid_spare_count": null,
                    "size": "3g",
                    "state": "present",
                    "thin": false,
                    "thin_pool_name": null,
                    "thin_pool_size": null,
                    "type": "lvm",
                    "vdo_pool_size": null
                }
            ]
        }
    ]
}

TASK [Print out volume information] ********************************************
task path: /tmp/tmprua6lrek/tests/verify-role-results.yml:6
Wednesday 06 July 2022  11:01:28 +0000 (0:00:00.053)       0:00:20.974 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {}

TASK [Collect info about the volumes.] *****************************************
task path: /tmp/tmprua6lrek/tests/verify-role-results.yml:14
Wednesday 06 July 2022  11:01:28 +0000 (0:00:00.034)       0:00:21.009 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "info": {
        "/dev/mapper/foo-test1": {
            "fstype": "xfs",
            "label": "",
            "name": "/dev/mapper/foo-test1",
            "size": "3G",
            "type": "lvm",
            "uuid": "e363aca8-4773-4789-a338-0cadc3e349f5"
        },
        "/dev/nvme0n1": {
            "fstype": "",
            "label": "",
            "name": "/dev/nvme0n1",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/nvme1n1": {
            "fstype": "",
            "label": "",
            "name": "/dev/nvme1n1",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/nvme2n1": {
            "fstype": "",
            "label": "",
            "name": "/dev/nvme2n1",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/sda": {
            "fstype": "LVM2_member",
            "label": "",
            "name": "/dev/sda",
            "size": "10G",
            "type": "disk",
            "uuid": "jTr21u-XFvQ-FDKv-U00Y-g4iZ-7hhe-ySdWgB"
        },
        "/dev/sdb": {
            "fstype": "",
            "label": "",
            "name": "/dev/sdb",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/sdc": {
            "fstype": "",
            "label": "",
            "name": "/dev/sdc",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/sr0": {
            "fstype": "iso9660",
            "label": "cidata",
            "name": "/dev/sr0",
            "size": "364K",
            "type": "rom",
            "uuid": "2022-07-06-11-00-54-00"
        },
        "/dev/vda": {
            "fstype": "",
            "label": "",
            "name": "/dev/vda",
            "size": "4G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vda1": {
            "fstype": "",
            "label": "",
            "name": "/dev/vda1",
            "size": "1M",
            "type": "partition",
            "uuid": ""
        },
        "/dev/vda2": {
            "fstype": "ext4",
            "label": "boot",
            "name": "/dev/vda2",
            "size": "500M",
            "type": "partition",
            "uuid": "5f2f82d0-ae0a-4574-8811-62a31a51a870"
        },
        "/dev/vda3": {
            "fstype": "vfat",
            "label": "",
            "name": "/dev/vda3",
            "size": "100M",
            "type": "partition",
            "uuid": "5B84-6DD7"
        },
        "/dev/vda4": {
            "fstype": "",
            "label": "",
            "name": "/dev/vda4",
            "size": "4M",
            "type": "partition",
            "uuid": ""
        },
        "/dev/vda5": {
            "fstype": "btrfs",
            "label": "fedora",
            "name": "/dev/vda5",
            "size": "3.4G",
            "type": "partition",
            "uuid": "fbdaf05f-1a41-4dc5-b56e-a10edb430f9a"
        },
        "/dev/vdb": {
            "fstype": "",
            "label": "",
            "name": "/dev/vdb",
            "size": "2G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vdb1": {
            "fstype": "ext4",
            "label": "yumcache",
            "name": "/dev/vdb1",
            "size": "2G",
            "type": "partition",
            "uuid": "e676dfc5-3e4b-4331-8ede-73c3f56d2cab"
        },
        "/dev/vdc": {
            "fstype": "",
            "label": "",
            "name": "/dev/vdc",
            "size": "2G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vdc1": {
            "fstype": "ext4",
            "label": "yumvarlib",
            "name": "/dev/vdc1",
            "size": "2G",
            "type": "partition",
            "uuid": "0c299eb4-81f5-4414-b246-b95738eb82f0"
        },
        "/dev/vdd": {
            "fstype": "",
            "label": "",
            "name": "/dev/vdd",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vde": {
            "fstype": "",
            "label": "",
            "name": "/dev/vde",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vdf": {
            "fstype": "",
            "label": "",
            "name": "/dev/vdf",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/zram0": {
            "fstype": "",
            "label": "",
            "name": "/dev/zram0",
            "size": "1.9G",
            "type": "disk",
            "uuid": ""
        }
    }
}
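
The device info above comes from a test helper module rather than a stock Ansible module. A roughly equivalent manual check on the target host, useful when reading this log next to the machine, might look like this (illustrative only; the register name is made up):

    - name: Inspect block devices by hand (not part of the test)
      command: lsblk -o NAME,FSTYPE,LABEL,SIZE,TYPE,UUID
      register: lsblk_info
      changed_when: false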

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmprua6lrek/tests/verify-role-results.yml:19
Wednesday 06 July 2022  11:01:28 +0000 (0:00:00.555)       0:00:21.564 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "cmd": [
        "cat",
        "/etc/fstab"
    ],
    "delta": "0:00:00.003214",
    "end": "2022-07-06 11:01:28.838999",
    "rc": 0,
    "start": "2022-07-06 11:01:28.835785"
}

STDOUT:


#
# /etc/fstab
# Created by anaconda on Tue Jul  5 07:18:20 2022
#
# Accessible filesystems, by reference, are maintained under '/dev/disk/'.
# See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info.
#
# After editing this file, run 'systemctl daemon-reload' to update systemd
# units generated from this file.
#
UUID=fbdaf05f-1a41-4dc5-b56e-a10edb430f9a /                       btrfs   subvol=root,compress=zstd:1 0 0
UUID=5f2f82d0-ae0a-4574-8811-62a31a51a870 /boot                   ext4    defaults        1 2
UUID=5B84-6DD7          /boot/efi               vfat    defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
UUID=fbdaf05f-1a41-4dc5-b56e-a10edb430f9a /home                   btrfs   subvol=home,compress=zstd:1 0 0
/dev/vdb1	/var/cache/dnf	auto	defaults,nofail,comment=cloudconfig	0	2
/dev/vdc1	/var/lib/dnf	auto	defaults,nofail,comment=cloudconfig	0	2
/dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0
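
The last line confirms the new fstab entry added for foo-test1. The verification that follows presumably boils down to an assertion over this command output; an illustrative check under that assumption (the fstab_contents register name is made up):

    - name: Check the fstab entry for the new volume (illustrative only)
      assert:
        that:
          - "fstab_contents.stdout_lines | select('search', '^/dev/mapper/foo-test1 ') | list | length == 1"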

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmprua6lrek/tests/verify-role-results.yml:24
Wednesday 06 July 2022  11:01:29 +0000 (0:00:00.563)       0:00:22.128 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "cmd": [
        "cat",
        "/etc/crypttab"
    ],
    "delta": "0:00:00.003998",
    "end": "2022-07-06 11:01:29.277479",
    "failed_when_result": false,
    "rc": 0,
    "start": "2022-07-06 11:01:29.273481"
}

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmprua6lrek/tests/verify-role-results.yml:33
Wednesday 06 July 2022  11:01:29 +0000 (0:00:00.431)       0:00:22.559 ******** 
included: /tmp/tmprua6lrek/tests/test-verify-pool.yml for /cache/fedora-35.qcow2.snap => (item={'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'name': 'test1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/dm-0'}], 'raid_chunk_size': None})

TASK [Set _storage_pool_tests] *************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool.yml:5
Wednesday 06 July 2022  11:01:30 +0000 (0:00:00.076)       0:00:22.636 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_pool_tests": [
            "members",
            "volumes"
        ]
    },
    "changed": false
}

TASK [include_tasks] ***********************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool.yml:18
Wednesday 06 July 2022  11:01:30 +0000 (0:00:00.044)       0:00:22.681 ******** 
included: /tmp/tmprua6lrek/tests/test-verify-pool-members.yml for /cache/fedora-35.qcow2.snap => (item=members)
included: /tmp/tmprua6lrek/tests/test-verify-pool-volumes.yml for /cache/fedora-35.qcow2.snap => (item=volumes)

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool-members.yml:1
Wednesday 06 July 2022  11:01:30 +0000 (0:00:00.050)       0:00:22.731 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_expected_pv_count": "1",
        "_storage_test_pool_pvs_lvm": [
            "/dev/sda"
        ]
    },
    "changed": false
}

TASK [Get the canonical device path for each member device] ********************
task path: /tmp/tmprua6lrek/tests/test-verify-pool-members.yml:10
Wednesday 06 July 2022  11:01:30 +0000 (0:00:00.055)       0:00:22.786 ******** 
ok: [/cache/fedora-35.qcow2.snap] => (item=/dev/sda) => {
    "ansible_loop_var": "pv",
    "changed": false,
    "device": "/dev/sda",
    "pv": "/dev/sda"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool-members.yml:19
Wednesday 06 July 2022  11:01:30 +0000 (0:00:00.490)       0:00:23.277 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "__pvs_lvm_len": "1"
    },
    "changed": false
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool-members.yml:23
Wednesday 06 July 2022  11:01:30 +0000 (0:00:00.060)       0:00:23.337 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_pool_pvs": [
            "/dev/sda"
        ]
    },
    "changed": false
}

TASK [Verify PV count] *********************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool-members.yml:27
Wednesday 06 July 2022  11:01:30 +0000 (0:00:00.055)       0:00:23.393 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed
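
Given the facts set just above (__pvs_lvm_len and _storage_test_expected_pv_count), the "Verify PV count" task presumably compares the two; a sketch of such an assertion, not the test's literal wording:

    - name: Verify PV count
      assert:
        that:
          - "__pvs_lvm_len | int == _storage_test_expected_pv_count | int"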

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool-members.yml:34
Wednesday 06 July 2022  11:01:30 +0000 (0:00:00.060)       0:00:23.453 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_expected_pv_type": "disk"
    },
    "changed": false
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool-members.yml:38
Wednesday 06 July 2022  11:01:30 +0000 (0:00:00.037)       0:00:23.491 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_expected_pv_type": "disk"
    },
    "changed": false
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool-members.yml:42
Wednesday 06 July 2022  11:01:30 +0000 (0:00:00.049)       0:00:23.540 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check the type of each PV] ***********************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool-members.yml:46
Wednesday 06 July 2022  11:01:30 +0000 (0:00:00.024)       0:00:23.564 ******** 
ok: [/cache/fedora-35.qcow2.snap] => (item=/dev/sda) => {
    "ansible_loop_var": "pv",
    "changed": false,
    "pv": "/dev/sda"
}

MSG:

All assertions passed

TASK [Check MD RAID] ***********************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool-members.yml:56
Wednesday 06 July 2022  11:01:30 +0000 (0:00:00.044)       0:00:23.608 ******** 
included: /tmp/tmprua6lrek/tests/verify-pool-md.yml for /cache/fedora-35.qcow2.snap

TASK [get information about RAID] **********************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-md.yml:6
Wednesday 06 July 2022  11:01:31 +0000 (0:00:00.043)       0:00:23.652 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-md.yml:12
Wednesday 06 July 2022  11:01:31 +0000 (0:00:00.026)       0:00:23.678 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-md.yml:16
Wednesday 06 July 2022  11:01:31 +0000 (0:00:00.025)       0:00:23.704 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-md.yml:20
Wednesday 06 July 2022  11:01:31 +0000 (0:00:00.028)       0:00:23.733 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-md.yml:24
Wednesday 06 July 2022  11:01:31 +0000 (0:00:00.026)       0:00:23.760 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-md.yml:30
Wednesday 06 July 2022  11:01:31 +0000 (0:00:00.039)       0:00:23.800 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-md.yml:36
Wednesday 06 July 2022  11:01:31 +0000 (0:00:00.026)       0:00:23.826 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-md.yml:44
Wednesday 06 July 2022  11:01:31 +0000 (0:00:00.027)       0:00:23.853 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_md_active_devices_re": null,
        "storage_test_md_metadata_version_re": null,
        "storage_test_md_spare_devices_re": null
    },
    "changed": false
}

TASK [Check LVM RAID] **********************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool-members.yml:59
Wednesday 06 July 2022  11:01:31 +0000 (0:00:00.037)       0:00:23.891 ******** 
included: /tmp/tmprua6lrek/tests/verify-pool-members-lvmraid.yml for /cache/fedora-35.qcow2.snap

TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/tmprua6lrek/tests/verify-pool-members-lvmraid.yml:1
Wednesday 06 July 2022  11:01:31 +0000 (0:00:00.045)       0:00:23.937 ******** 
included: /tmp/tmprua6lrek/tests/verify-pool-member-lvmraid.yml for /cache/fedora-35.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'name': 'test1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/dm-0'})

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-lvmraid.yml:3
Wednesday 06 July 2022  11:01:31 +0000 (0:00:00.045)       0:00:23.982 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-lvmraid.yml:8
Wednesday 06 July 2022  11:01:31 +0000 (0:00:00.026)       0:00:24.008 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-lvmraid.yml:12
Wednesday 06 July 2022  11:01:31 +0000 (0:00:00.026)       0:00:24.035 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check Thin Pools] ********************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool-members.yml:62
Wednesday 06 July 2022  11:01:31 +0000 (0:00:00.028)       0:00:24.063 ******** 
included: /tmp/tmprua6lrek/tests/verify-pool-members-thin.yml for /cache/fedora-35.qcow2.snap

TASK [Validate pool member thinpool settings] **********************************
task path: /tmp/tmprua6lrek/tests/verify-pool-members-thin.yml:1
Wednesday 06 July 2022  11:01:31 +0000 (0:00:00.081)       0:00:24.145 ******** 
included: /tmp/tmprua6lrek/tests/verify-pool-member-thin.yml for /cache/fedora-35.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'name': 'test1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/dm-0'})

TASK [Get information about thinpool] ******************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-thin.yml:3
Wednesday 06 July 2022  11:01:31 +0000 (0:00:00.047)       0:00:24.192 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check that volume is in correct thinpool (when thinp name is provided)] ***
task path: /tmp/tmprua6lrek/tests/verify-pool-member-thin.yml:8
Wednesday 06 July 2022  11:01:31 +0000 (0:00:00.029)       0:00:24.222 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check that volume is in thinpool (when thinp name is not provided)] ******
task path: /tmp/tmprua6lrek/tests/verify-pool-member-thin.yml:13
Wednesday 06 July 2022  11:01:31 +0000 (0:00:00.024)       0:00:24.247 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-thin.yml:17
Wednesday 06 July 2022  11:01:31 +0000 (0:00:00.026)       0:00:24.273 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check member encryption] *************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool-members.yml:65
Wednesday 06 July 2022  11:01:31 +0000 (0:00:00.025)       0:00:24.299 ******** 
included: /tmp/tmprua6lrek/tests/verify-pool-members-encryption.yml for /cache/fedora-35.qcow2.snap

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-members-encryption.yml:4
Wednesday 06 July 2022  11:01:31 +0000 (0:00:00.049)       0:00:24.349 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_expected_crypttab_entries": "0",
        "_storage_test_expected_crypttab_key_file": "-"
    },
    "changed": false
}

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-members-encryption.yml:8
Wednesday 06 July 2022  11:01:31 +0000 (0:00:00.052)       0:00:24.401 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => (item=/dev/sda)  => {
    "_storage_test_pool_member_path": "/dev/sda",
    "ansible_loop_var": "_storage_test_pool_member_path",
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/tmprua6lrek/tests/verify-pool-members-encryption.yml:15
Wednesday 06 July 2022  11:01:31 +0000 (0:00:00.030)       0:00:24.432 ******** 
included: /tmp/tmprua6lrek/tests/verify-pool-member-crypttab.yml for /cache/fedora-35.qcow2.snap => (item=/dev/sda)

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-crypttab.yml:1
Wednesday 06 July 2022  11:01:31 +0000 (0:00:00.043)       0:00:24.476 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": []
    },
    "changed": false
}

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-crypttab.yml:6
Wednesday 06 July 2022  11:01:31 +0000 (0:00:00.053)       0:00:24.529 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-crypttab.yml:11
Wednesday 06 July 2022  11:01:31 +0000 (0:00:00.058)       0:00:24.588 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-crypttab.yml:17
Wednesday 06 July 2022  11:01:32 +0000 (0:00:00.037)       0:00:24.625 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-crypttab.yml:23
Wednesday 06 July 2022  11:01:32 +0000 (0:00:00.037)       0:00:24.663 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-crypttab.yml:29
Wednesday 06 July 2022  11:01:32 +0000 (0:00:00.042)       0:00:24.706 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": null
    },
    "changed": false
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-members-encryption.yml:22
Wednesday 06 July 2022  11:01:32 +0000 (0:00:00.038)       0:00:24.745 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": null,
        "_storage_test_crypttab_key_file": null
    },
    "changed": false
}

TASK [Check VDO] ***************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool-members.yml:68
Wednesday 06 July 2022  11:01:32 +0000 (0:00:00.035)       0:00:24.780 ******** 
included: /tmp/tmprua6lrek/tests/verify-pool-members-vdo.yml for /cache/fedora-35.qcow2.snap

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-members-vdo.yml:1
Wednesday 06 July 2022  11:01:32 +0000 (0:00:00.050)       0:00:24.830 ******** 
included: /tmp/tmprua6lrek/tests/verify-pool-member-vdo.yml for /cache/fedora-35.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'name': 'test1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/dm-0'})

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-vdo.yml:3
Wednesday 06 July 2022  11:01:32 +0000 (0:00:00.047)       0:00:24.878 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-vdo.yml:8
Wednesday 06 July 2022  11:01:32 +0000 (0:00:00.025)       0:00:24.903 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [assert] ******************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-vdo.yml:11
Wednesday 06 July 2022  11:01:32 +0000 (0:00:00.026)       0:00:24.929 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [assert] ******************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-vdo.yml:16
Wednesday 06 July 2022  11:01:32 +0000 (0:00:00.026)       0:00:24.956 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-vdo.yml:21
Wednesday 06 July 2022  11:01:32 +0000 (0:00:00.025)       0:00:24.981 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [assert] ******************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-vdo.yml:24
Wednesday 06 July 2022  11:01:32 +0000 (0:00:00.026)       0:00:25.008 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [assert] ******************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-vdo.yml:29
Wednesday 06 July 2022  11:01:32 +0000 (0:00:00.028)       0:00:25.037 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-vdo.yml:39
Wednesday 06 July 2022  11:01:32 +0000 (0:00:00.027)       0:00:25.065 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_vdo_status": null
    },
    "changed": false
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool-members.yml:71
Wednesday 06 July 2022  11:01:32 +0000 (0:00:00.041)       0:00:25.106 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "__pvs_lvm_len": null,
        "_storage_test_expected_pv_count": null,
        "_storage_test_expected_pv_type": null,
        "_storage_test_pool_pvs": [],
        "_storage_test_pool_pvs_lvm": []
    },
    "changed": false
}

TASK [verify the volumes] ******************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool-volumes.yml:3
Wednesday 06 July 2022  11:01:32 +0000 (0:00:00.037)       0:00:25.144 ******** 
included: /tmp/tmprua6lrek/tests/test-verify-volume.yml for /cache/fedora-35.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'name': 'test1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/dm-0'})

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume.yml:2
Wednesday 06 July 2022  11:01:32 +0000 (0:00:00.047)       0:00:25.191 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_volume_present": true,
        "_storage_volume_tests": [
            "mount",
            "fstab",
            "fs",
            "device",
            "encryption",
            "md",
            "size",
            "cache"
        ]
    },
    "changed": false
}

TASK [include_tasks] ***********************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume.yml:10
Wednesday 06 July 2022  11:01:32 +0000 (0:00:00.049)       0:00:25.241 ******** 
included: /tmp/tmprua6lrek/tests/test-verify-volume-mount.yml for /cache/fedora-35.qcow2.snap => (item=mount)
included: /tmp/tmprua6lrek/tests/test-verify-volume-fstab.yml for /cache/fedora-35.qcow2.snap => (item=fstab)
included: /tmp/tmprua6lrek/tests/test-verify-volume-fs.yml for /cache/fedora-35.qcow2.snap => (item=fs)
included: /tmp/tmprua6lrek/tests/test-verify-volume-device.yml for /cache/fedora-35.qcow2.snap => (item=device)
included: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml for /cache/fedora-35.qcow2.snap => (item=encryption)
included: /tmp/tmprua6lrek/tests/test-verify-volume-md.yml for /cache/fedora-35.qcow2.snap => (item=md)
included: /tmp/tmprua6lrek/tests/test-verify-volume-size.yml for /cache/fedora-35.qcow2.snap => (item=size)
included: /tmp/tmprua6lrek/tests/test-verify-volume-cache.yml for /cache/fedora-35.qcow2.snap => (item=cache)
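
The fan-out above is a one-line pattern: each entry of _storage_volume_tests selects the matching test-verify-volume-<item>.yml file. A minimal sketch of such an include loop (the actual task at test-verify-volume.yml:10 may differ in detail; only the item values and included file names are taken from the log):

- include_tasks: "test-verify-volume-{{ item }}.yml"   # e.g. test-verify-volume-mount.yml for item=mount
  loop: "{{ _storage_volume_tests }}"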

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-mount.yml:6
Wednesday 06 July 2022  11:01:32 +0000 (0:00:00.117)       0:00:25.358 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_device_path": "/dev/mapper/foo-test1"
    },
    "changed": false
}

TASK [Set some facts] **********************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-mount.yml:14
Wednesday 06 July 2022  11:01:32 +0000 (0:00:00.041)       0:00:25.400 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_mount_device_matches": [
            {
                "block_available": 770146,
                "block_size": 4096,
                "block_total": 783872,
                "block_used": 13726,
                "device": "/dev/mapper/foo-test1",
                "fstype": "xfs",
                "inode_available": 1572861,
                "inode_total": 1572864,
                "inode_used": 3,
                "mount": "/opt/test1",
                "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota",
                "size_available": 3154518016,
                "size_total": 3210739712,
                "uuid": "e363aca8-4773-4789-a338-0cadc3e349f5"
            }
        ],
        "storage_test_mount_expected_match_count": "1",
        "storage_test_mount_point_matches": [
            {
                "block_available": 770146,
                "block_size": 4096,
                "block_total": 783872,
                "block_used": 13726,
                "device": "/dev/mapper/foo-test1",
                "fstype": "xfs",
                "inode_available": 1572861,
                "inode_total": 1572864,
                "inode_used": 3,
                "mount": "/opt/test1",
                "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota",
                "size_available": 3154518016,
                "size_total": 3210739712,
                "uuid": "e363aca8-4773-4789-a338-0cadc3e349f5"
            }
        ],
        "storage_test_swap_expected_matches": "0"
    },
    "changed": false
}
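
Both match lists above are slices of the host's ansible_mounts facts, filtered down to the volume's device path and its mount point respectively. A sketch of how such facts can be derived (the filter expressions here are illustrative; the test's own expressions are not shown in the log):

- set_fact:
    # entries whose device matches /dev/mapper/foo-test1
    storage_test_mount_device_matches: "{{ ansible_mounts | selectattr('device', 'equalto', storage_test_device_path) | list }}"
    # entries whose mount point matches /opt/test1
    storage_test_mount_point_matches: "{{ ansible_mounts | selectattr('mount', 'equalto', storage_test_volume.mount_point) | list }}"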

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-mount.yml:28
Wednesday 06 July 2022  11:01:32 +0000 (0:00:00.054)       0:00:25.454 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-mount.yml:37
Wednesday 06 July 2022  11:01:32 +0000 (0:00:00.048)       0:00:25.502 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-mount.yml:45
Wednesday 06 July 2022  11:01:32 +0000 (0:00:00.045)       0:00:25.548 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-mount.yml:54
Wednesday 06 July 2022  11:01:32 +0000 (0:00:00.047)       0:00:25.595 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Gather swap info] ********************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-mount.yml:58
Wednesday 06 July 2022  11:01:33 +0000 (0:00:00.023)       0:00:25.618 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify swap status] ******************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-mount.yml:63
Wednesday 06 July 2022  11:01:33 +0000 (0:00:00.024)       0:00:25.643 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Unset facts] *************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-mount.yml:75
Wednesday 06 July 2022  11:01:33 +0000 (0:00:00.022)       0:00:25.666 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_mount_device_matches": null,
        "storage_test_mount_expected_match_count": null,
        "storage_test_mount_point_matches": null,
        "storage_test_swap_expected_matches": null,
        "storage_test_swaps": null,
        "storage_test_sys_node": null
    },
    "changed": false
}

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-fstab.yml:2
Wednesday 06 July 2022  11:01:33 +0000 (0:00:00.034)       0:00:25.700 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_fstab_expected_id_matches": "1",
        "storage_test_fstab_expected_mount_options_matches": "1",
        "storage_test_fstab_expected_mount_point_matches": "1",
        "storage_test_fstab_id_matches": [
            "/dev/mapper/foo-test1 "
        ],
        "storage_test_fstab_mount_options_matches": [
            " /opt/test1 xfs defaults "
        ],
        "storage_test_fstab_mount_point_matches": [
            " /opt/test1 "
        ]
    },
    "changed": false
}

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-fstab.yml:25
Wednesday 06 July 2022  11:01:33 +0000 (0:00:00.062)       0:00:25.762 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-fstab.yml:32
Wednesday 06 July 2022  11:01:33 +0000 (0:00:00.048)       0:00:25.811 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-fstab.yml:39
Wednesday 06 July 2022  11:01:33 +0000 (0:00:00.047)       0:00:25.859 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Clean up variables] ******************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-fstab.yml:49
Wednesday 06 July 2022  11:01:33 +0000 (0:00:00.036)       0:00:25.895 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_fstab_expected_id_matches": null,
        "storage_test_fstab_expected_mount_options_matches": null,
        "storage_test_fstab_expected_mount_point_matches": null,
        "storage_test_fstab_id_matches": null,
        "storage_test_fstab_mount_options_matches": null,
        "storage_test_fstab_mount_point_matches": null
    },
    "changed": false
}

TASK [Verify fs type] **********************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-fs.yml:4
Wednesday 06 July 2022  11:01:33 +0000 (0:00:00.039)       0:00:25.934 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-fs.yml:10
Wednesday 06 July 2022  11:01:33 +0000 (0:00:00.038)       0:00:25.973 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-device.yml:4
Wednesday 06 July 2022  11:01:33 +0000 (0:00:00.042)       0:00:26.015 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "stat": {
        "atime": 1657105285.4657035,
        "attr_flags": "",
        "attributes": [],
        "block_size": 4096,
        "blocks": 0,
        "charset": "binary",
        "ctime": 1657105283.3717036,
        "dev": 5,
        "device_type": 64768,
        "executable": false,
        "exists": true,
        "gid": 6,
        "gr_name": "disk",
        "inode": 505,
        "isblk": true,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": false,
        "issock": false,
        "isuid": false,
        "mimetype": "inode/symlink",
        "mode": "0660",
        "mtime": 1657105283.3717036,
        "nlink": 1,
        "path": "/dev/mapper/foo-test1",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": false,
        "rusr": true,
        "size": 0,
        "uid": 0,
        "version": null,
        "wgrp": true,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": false,
        "xoth": false,
        "xusr": false
    }
}

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-device.yml:10
Wednesday 06 July 2022  11:01:33 +0000 (0:00:00.412)       0:00:26.427 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-device.yml:18
Wednesday 06 July 2022  11:01:33 +0000 (0:00:00.039)       0:00:26.466 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-device.yml:24
Wednesday 06 July 2022  11:01:33 +0000 (0:00:00.040)       0:00:26.507 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "st_volume_type": "lvm"
    },
    "changed": false
}

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-device.yml:28
Wednesday 06 July 2022  11:01:33 +0000 (0:00:00.038)       0:00:26.546 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-device.yml:33
Wednesday 06 July 2022  11:01:33 +0000 (0:00:00.025)       0:00:26.572 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:3
Wednesday 06 July 2022  11:01:33 +0000 (0:00:00.036)       0:00:26.608 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Ensure cryptsetup is present] ********************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:10
Wednesday 06 July 2022  11:01:34 +0000 (0:00:00.023)       0:00:26.632 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:15
Wednesday 06 July 2022  11:01:35 +0000 (0:00:01.967)       0:00:28.599 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:21
Wednesday 06 July 2022  11:01:36 +0000 (0:00:00.026)       0:00:28.626 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:30
Wednesday 06 July 2022  11:01:36 +0000 (0:00:00.025)       0:00:28.652 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:38
Wednesday 06 July 2022  11:01:36 +0000 (0:00:00.054)       0:00:28.706 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:44
Wednesday 06 July 2022  11:01:36 +0000 (0:00:00.024)       0:00:28.731 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:49
Wednesday 06 July 2022  11:01:36 +0000 (0:00:00.023)       0:00:28.754 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:55
Wednesday 06 July 2022  11:01:36 +0000 (0:00:00.024)       0:00:28.779 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:61
Wednesday 06 July 2022  11:01:36 +0000 (0:00:00.029)       0:00:28.809 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:67
Wednesday 06 July 2022  11:01:36 +0000 (0:00:00.054)       0:00:28.863 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": [],
        "_storage_test_expected_crypttab_entries": "0",
        "_storage_test_expected_crypttab_key_file": "-"
    },
    "changed": false
}

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:74
Wednesday 06 July 2022  11:01:36 +0000 (0:00:00.050)       0:00:28.914 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:79
Wednesday 06 July 2022  11:01:36 +0000 (0:00:00.052)       0:00:28.967 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:85
Wednesday 06 July 2022  11:01:36 +0000 (0:00:00.041)       0:00:29.008 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:91
Wednesday 06 July 2022  11:01:36 +0000 (0:00:00.037)       0:00:29.046 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:97
Wednesday 06 July 2022  11:01:36 +0000 (0:00:00.036)       0:00:29.083 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": null,
        "_storage_test_expected_crypttab_entries": null,
        "_storage_test_expected_crypttab_key_file": null
    },
    "changed": false
}

TASK [get information about RAID] **********************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-md.yml:7
Wednesday 06 July 2022  11:01:36 +0000 (0:00:00.039)       0:00:29.122 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-md.yml:13
Wednesday 06 July 2022  11:01:36 +0000 (0:00:00.045)       0:00:29.168 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-md.yml:17
Wednesday 06 July 2022  11:01:36 +0000 (0:00:00.038)       0:00:29.206 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-md.yml:21
Wednesday 06 July 2022  11:01:36 +0000 (0:00:00.040)       0:00:29.246 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-md.yml:25
Wednesday 06 July 2022  11:01:36 +0000 (0:00:00.040)       0:00:29.287 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-md.yml:31
Wednesday 06 July 2022  11:01:36 +0000 (0:00:00.047)       0:00:29.335 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-md.yml:37
Wednesday 06 July 2022  11:01:36 +0000 (0:00:00.037)       0:00:29.373 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-size.yml:3
Wednesday 06 July 2022  11:01:36 +0000 (0:00:00.038)       0:00:29.412 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "bytes": 3221225472,
    "changed": false,
    "lvm": "3g",
    "parted": "3GiB",
    "size": "3 GiB"
}

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-size.yml:9
Wednesday 06 July 2022  11:01:37 +0000 (0:00:00.546)       0:00:29.958 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "bytes": 3221225472,
    "changed": false,
    "lvm": "3g",
    "parted": "3GiB",
    "size": "3 GiB"
}
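
Both parse steps agree because "3g" is read as 3 GiB: 3 x 1024^3 = 3221225472 bytes, the "bytes" value reported above. The test uses its own size helper (only its output appears in the log); assuming binary units, the stock human_to_bytes filter yields the same number:

- debug:
    msg: "{{ '3g' | human_to_bytes }}"   # 3221225472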

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-size.yml:15
Wednesday 06 July 2022  11:01:37 +0000 (0:00:00.436)       0:00:30.394 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_expected_size": "3221225472"
    },
    "changed": false
}

TASK [debug] *******************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-size.yml:20
Wednesday 06 July 2022  11:01:37 +0000 (0:00:00.052)       0:00:30.447 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "storage_test_expected_size": "3221225472"
}

TASK [debug] *******************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-size.yml:25
Wednesday 06 July 2022  11:01:37 +0000 (0:00:00.039)       0:00:30.487 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-size.yml:28
Wednesday 06 July 2022  11:01:37 +0000 (0:00:00.040)       0:00:30.527 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-size.yml:31
Wednesday 06 July 2022  11:01:37 +0000 (0:00:00.039)       0:00:30.567 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [debug] *******************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-size.yml:36
Wednesday 06 July 2022  11:01:37 +0000 (0:00:00.039)       0:00:30.606 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmprua6lrek/tests/test-verify-volume-size.yml:39
Wednesday 06 July 2022  11:01:38 +0000 (0:00:00.039)       0:00:30.646 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [debug] *******************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-size.yml:44
Wednesday 06 July 2022  11:01:38 +0000 (0:00:00.039)       0:00:30.685 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "storage_test_actual_size": {
        "bytes": 3221225472,
        "changed": false,
        "failed": false,
        "lvm": "3g",
        "parted": "3GiB",
        "size": "3 GiB"
    }
}

TASK [debug] *******************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-size.yml:47
Wednesday 06 July 2022  11:01:38 +0000 (0:00:00.036)       0:00:30.722 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "storage_test_expected_size": "3221225472"
}

TASK [assert] ******************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-size.yml:50
Wednesday 06 July 2022  11:01:38 +0000 (0:00:00.036)       0:00:30.759 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-cache.yml:6
Wednesday 06 July 2022  11:01:38 +0000 (0:00:00.055)       0:00:30.814 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "cmd": [
        "lvs",
        "--noheadings",
        "--nameprefixes",
        "--units=b",
        "--nosuffix",
        "--unquoted",
        "-o",
        "name,attr,cache_total_blocks,chunk_size,segtype",
        "foo/test1"
    ],
    "delta": "0:00:00.036586",
    "end": "2022-07-06 11:01:37.967472",
    "rc": 0,
    "start": "2022-07-06 11:01:37.930886"
}

STDOUT:

  LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear
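
The cache check reduces to the single lvs query shown in the cmd array above; LVM2_SEGTYPE=linear in the output confirms test1 is a plain (non-cached) LV. A sketch of an equivalent ad-hoc task, with the argument list copied verbatim:

- name: Get information about the LV
  command: >-
    lvs --noheadings --nameprefixes --units=b --nosuffix --unquoted
    -o name,attr,cache_total_blocks,chunk_size,segtype foo/test1
  register: lvs_out
  changed_when: false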

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-cache.yml:14
Wednesday 06 July 2022  11:01:38 +0000 (0:00:00.433)       0:00:31.248 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_lv_segtype": [
            "linear"
        ]
    },
    "changed": false
}

TASK [check segment type] ******************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-cache.yml:17
Wednesday 06 July 2022  11:01:38 +0000 (0:00:00.082)       0:00:31.331 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-cache.yml:22
Wednesday 06 July 2022  11:01:38 +0000 (0:00:00.054)       0:00:31.386 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-cache.yml:26
Wednesday 06 July 2022  11:01:38 +0000 (0:00:00.037)       0:00:31.423 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-cache.yml:32
Wednesday 06 July 2022  11:01:38 +0000 (0:00:00.038)       0:00:31.462 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check cache size] ********************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-cache.yml:36
Wednesday 06 July 2022  11:01:38 +0000 (0:00:00.038)       0:00:31.500 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Clean up facts] **********************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume.yml:16
Wednesday 06 July 2022  11:01:38 +0000 (0:00:00.039)       0:00:31.540 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_volume_present": null
    },
    "changed": false
}

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmprua6lrek/tests/verify-role-results.yml:40
Wednesday 06 July 2022  11:01:38 +0000 (0:00:00.032)       0:00:31.573 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_pool": null
    },
    "changed": false
}

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmprua6lrek/tests/verify-role-results.yml:47
Wednesday 06 July 2022  11:01:39 +0000 (0:00:00.070)       0:00:31.643 ******** 

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmprua6lrek/tests/verify-role-results.yml:57
Wednesday 06 July 2022  11:01:39 +0000 (0:00:00.061)       0:00:31.705 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_blkinfo": null,
        "storage_test_crypttab": null,
        "storage_test_fstab": null,
        "storage_test_volume": null
    },
    "changed": false
}

TASK [Change the mount location to ""] *****************************************
task path: /tmp/tmprua6lrek/tests/tests_remove_mount.yml:41
Wednesday 06 July 2022  11:01:39 +0000 (0:00:00.034)       0:00:31.740 ******** 

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 06 July 2022  11:01:39 +0000 (0:00:00.049)       0:00:31.789 ******** 
included: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/fedora-35.qcow2.snap

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 06 July 2022  11:01:39 +0000 (0:00:00.033)       0:00:31.823 ******** 
ok: [/cache/fedora-35.qcow2.snap]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 06 July 2022  11:01:39 +0000 (0:00:00.563)       0:00:32.387 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => (item=RedHat.yml)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
ok: [/cache/fedora-35.qcow2.snap] => (item=Fedora.yml) => {
    "ansible_facts": {
        "_storage_copr_packages": [
            {
                "packages": [
                    "vdo",
                    "kmod-vdo"
                ],
                "repository": "rhawalsh/dm-vdo"
            }
        ],
        "_storage_copr_support_packages": [
            "dnf-plugins-core"
        ],
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/vars/Fedora.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "Fedora.yml"
}
skipping: [/cache/fedora-35.qcow2.snap] => (item=Fedora_35.yml)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "Fedora_35.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [/cache/fedora-35.qcow2.snap] => (item=Fedora_35.yml)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "Fedora_35.yml",
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 06 July 2022  11:01:39 +0000 (0:00:00.063)       0:00:32.450 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 06 July 2022  11:01:39 +0000 (0:00:00.037)       0:00:32.488 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 06 July 2022  11:01:39 +0000 (0:00:00.032)       0:00:32.520 ******** 
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
included: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/fedora-35.qcow2.snap

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 06 July 2022  11:01:39 +0000 (0:00:00.049)       0:00:32.570 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 06 July 2022  11:01:39 +0000 (0:00:00.022)       0:00:32.592 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 06 July 2022  11:01:40 +0000 (0:00:00.029)       0:00:32.622 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "storage_pools": [
        {
            "disks": [
                "sda"
            ],
            "name": "foo",
            "volumes": [
                {
                    "mount_point": "",
                    "name": "test1",
                    "size": "3g"
                }
            ]
        }
    ]
}
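
This storage_pools value comes from the "Change the mount location to ''" step at tests_remove_mount.yml:41: the same pool and volume are re-declared with an empty mount_point so the role unmounts /opt/test1 without removing the LV. A minimal sketch of that step, reconstructed from the values shown here (the test file itself may set additional vars):

- name: Change the mount location to ""
  include_role:
    name: linux-system-roles.storage
  vars:
    storage_pools:
      - name: foo
        disks: ["sda"]
        volumes:
          - name: test1
            size: "3g"
            mount_point: ""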

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 06 July 2022  11:01:40 +0000 (0:00:00.042)       0:00:32.664 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined"
}

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 06 July 2022  11:01:40 +0000 (0:00:00.039)       0:00:32.704 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 06 July 2022  11:01:40 +0000 (0:00:00.028)       0:00:32.732 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 06 July 2022  11:01:40 +0000 (0:00:00.026)       0:00:32.759 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 06 July 2022  11:01:40 +0000 (0:00:00.026)       0:00:32.786 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 06 July 2022  11:01:40 +0000 (0:00:00.029)       0:00:32.816 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_cryptsetup_services": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 06 July 2022  11:01:40 +0000 (0:00:00.060)       0:00:32.877 ******** 

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 06 July 2022  11:01:40 +0000 (0:00:00.026)       0:00:32.903 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [
        "/dev/sr0",
        "/dev/vda1",
        "/dev/vda2",
        "/dev/vda3",
        "/dev/vda4",
        "/dev/vda5",
        "/dev/mapper/foo-test1",
        "/dev/sdb",
        "/dev/sdc",
        "/dev/nvme0n1",
        "/dev/nvme1n1",
        "/dev/nvme2n1",
        "/dev/vdb1",
        "/dev/vdc1",
        "/dev/vdd",
        "/dev/vde",
        "/dev/vdf",
        "/dev/zram0"
    ],
    "mounts": [
        {
            "path": "/opt/test1",
            "state": "absent"
        }
    ],
    "packages": [
        "dosfstools",
        "lvm2",
        "xfsprogs",
        "e2fsprogs",
        "btrfs-progs"
    ],
    "pools": [
        {
            "disks": [
                "sda"
            ],
            "encryption": false,
            "encryption_cipher": null,
            "encryption_key": null,
            "encryption_key_size": null,
            "encryption_luks_version": null,
            "encryption_password": null,
            "name": "foo",
            "raid_chunk_size": null,
            "raid_device_count": null,
            "raid_level": null,
            "raid_metadata_version": null,
            "raid_spare_count": null,
            "state": "present",
            "type": "lvm",
            "volumes": [
                {
                    "_device": "/dev/mapper/foo-test1",
                    "_kernel_device": "/dev/dm-0",
                    "_mount_id": "/dev/mapper/foo-test1",
                    "_raw_device": "/dev/mapper/foo-test1",
                    "_raw_kernel_device": "/dev/dm-0",
                    "cache_devices": [],
                    "cache_mode": null,
                    "cache_size": 0,
                    "cached": false,
                    "compression": null,
                    "deduplication": null,
                    "disks": [
                        "sda"
                    ],
                    "encryption": false,
                    "encryption_cipher": null,
                    "encryption_key": null,
                    "encryption_key_size": null,
                    "encryption_luks_version": null,
                    "encryption_password": null,
                    "fs_create_options": "",
                    "fs_label": "",
                    "fs_overwrite_existing": true,
                    "fs_type": "xfs",
                    "mount_check": 0,
                    "mount_device_identifier": "uuid",
                    "mount_options": "defaults",
                    "mount_passno": 0,
                    "mount_point": "",
                    "name": "test1",
                    "raid_chunk_size": null,
                    "raid_device_count": null,
                    "raid_disks": [],
                    "raid_level": null,
                    "raid_metadata_version": null,
                    "raid_spare_count": null,
                    "size": "3g",
                    "state": "present",
                    "thin": false,
                    "thin_pool_name": null,
                    "thin_pool_size": null,
                    "type": "lvm",
                    "vdo_pool_size": null
                }
            ]
        }
    ],
    "volumes": []
}

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 06 July 2022  11:01:42 +0000 (0:00:01.995)       0:00:34.899 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 06 July 2022  11:01:42 +0000 (0:00:00.037)       0:00:34.937 ******** 

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 06 July 2022  11:01:42 +0000 (0:00:00.022)       0:00:34.959 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "blivet_output": {
        "actions": [],
        "changed": false,
        "crypts": [],
        "failed": false,
        "leaves": [
            "/dev/sr0",
            "/dev/vda1",
            "/dev/vda2",
            "/dev/vda3",
            "/dev/vda4",
            "/dev/vda5",
            "/dev/mapper/foo-test1",
            "/dev/sdb",
            "/dev/sdc",
            "/dev/nvme0n1",
            "/dev/nvme1n1",
            "/dev/nvme2n1",
            "/dev/vdb1",
            "/dev/vdc1",
            "/dev/vdd",
            "/dev/vde",
            "/dev/vdf",
            "/dev/zram0"
        ],
        "mounts": [
            {
                "path": "/opt/test1",
                "state": "absent"
            }
        ],
        "packages": [
            "dosfstools",
            "lvm2",
            "xfsprogs",
            "e2fsprogs",
            "btrfs-progs"
        ],
        "pools": [
            {
                "disks": [
                    "sda"
                ],
                "encryption": false,
                "encryption_cipher": null,
                "encryption_key": null,
                "encryption_key_size": null,
                "encryption_luks_version": null,
                "encryption_password": null,
                "name": "foo",
                "raid_chunk_size": null,
                "raid_device_count": null,
                "raid_level": null,
                "raid_metadata_version": null,
                "raid_spare_count": null,
                "state": "present",
                "type": "lvm",
                "volumes": [
                    {
                        "_device": "/dev/mapper/foo-test1",
                        "_kernel_device": "/dev/dm-0",
                        "_mount_id": "/dev/mapper/foo-test1",
                        "_raw_device": "/dev/mapper/foo-test1",
                        "_raw_kernel_device": "/dev/dm-0",
                        "cache_devices": [],
                        "cache_mode": null,
                        "cache_size": 0,
                        "cached": false,
                        "compression": null,
                        "deduplication": null,
                        "disks": [
                            "sda"
                        ],
                        "encryption": false,
                        "encryption_cipher": null,
                        "encryption_key": null,
                        "encryption_key_size": null,
                        "encryption_luks_version": null,
                        "encryption_password": null,
                        "fs_create_options": "",
                        "fs_label": "",
                        "fs_overwrite_existing": true,
                        "fs_type": "xfs",
                        "mount_check": 0,
                        "mount_device_identifier": "uuid",
                        "mount_options": "defaults",
                        "mount_passno": 0,
                        "mount_point": "",
                        "name": "test1",
                        "raid_chunk_size": null,
                        "raid_device_count": null,
                        "raid_disks": [],
                        "raid_level": null,
                        "raid_metadata_version": null,
                        "raid_spare_count": null,
                        "size": "3g",
                        "state": "present",
                        "thin": false,
                        "thin_pool_name": null,
                        "thin_pool_size": null,
                        "type": "lvm",
                        "vdo_pool_size": null
                    }
                ]
            }
        ],
        "volumes": []
    }
}

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 06 July 2022  11:01:42 +0000 (0:00:00.039)       0:00:34.998 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_pools_list": [
            {
                "disks": [
                    "sda"
                ],
                "encryption": false,
                "encryption_cipher": null,
                "encryption_key": null,
                "encryption_key_size": null,
                "encryption_luks_version": null,
                "encryption_password": null,
                "name": "foo",
                "raid_chunk_size": null,
                "raid_device_count": null,
                "raid_level": null,
                "raid_metadata_version": null,
                "raid_spare_count": null,
                "state": "present",
                "type": "lvm",
                "volumes": [
                    {
                        "_device": "/dev/mapper/foo-test1",
                        "_kernel_device": "/dev/dm-0",
                        "_mount_id": "/dev/mapper/foo-test1",
                        "_raw_device": "/dev/mapper/foo-test1",
                        "_raw_kernel_device": "/dev/dm-0",
                        "cache_devices": [],
                        "cache_mode": null,
                        "cache_size": 0,
                        "cached": false,
                        "compression": null,
                        "deduplication": null,
                        "disks": [
                            "sda"
                        ],
                        "encryption": false,
                        "encryption_cipher": null,
                        "encryption_key": null,
                        "encryption_key_size": null,
                        "encryption_luks_version": null,
                        "encryption_password": null,
                        "fs_create_options": "",
                        "fs_label": "",
                        "fs_overwrite_existing": true,
                        "fs_type": "xfs",
                        "mount_check": 0,
                        "mount_device_identifier": "uuid",
                        "mount_options": "defaults",
                        "mount_passno": 0,
                        "mount_point": "",
                        "name": "test1",
                        "raid_chunk_size": null,
                        "raid_device_count": null,
                        "raid_disks": [],
                        "raid_level": null,
                        "raid_metadata_version": null,
                        "raid_spare_count": null,
                        "size": "3g",
                        "state": "present",
                        "thin": false,
                        "thin_pool_name": null,
                        "thin_pool_size": null,
                        "type": "lvm",
                        "vdo_pool_size": null
                    }
                ]
            }
        ]
    },
    "changed": false
}

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 06 July 2022  11:01:42 +0000 (0:00:00.043)       0:00:35.042 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 06 July 2022  11:01:42 +0000 (0:00:00.039)       0:00:35.081 ******** 
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
changed: [/cache/fedora-35.qcow2.snap] => (item={'path': '/opt/test1', 'state': 'absent'}) => {
    "ansible_loop_var": "mount_info",
    "backup_file": "",
    "boot": "yes",
    "changed": true,
    "dump": "0",
    "fstab": "/etc/fstab",
    "mount_info": {
        "path": "/opt/test1",
        "state": "absent"
    },
    "name": "/opt/test1",
    "opts": "defaults",
    "passno": "0"
}

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 06 July 2022  11:01:42 +0000 (0:00:00.430)       0:00:35.512 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "name": null,
    "status": {}
}

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 06 July 2022  11:01:43 +0000 (0:00:00.725)       0:00:36.238 ******** 

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 06 July 2022  11:01:43 +0000 (0:00:00.039)       0:00:36.277 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "name": null,
    "status": {}
}

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 06 July 2022  11:01:44 +0000 (0:00:00.715)       0:00:36.993 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "stat": {
        "atime": 1657093385.4860332,
        "attr_flags": "",
        "attributes": [],
        "block_size": 4096,
        "blocks": 0,
        "charset": "binary",
        "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709",
        "ctime": 1657005647.423,
        "dev": 31,
        "device_type": 0,
        "executable": false,
        "exists": true,
        "gid": 0,
        "gr_name": "root",
        "inode": 267,
        "isblk": false,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": true,
        "issock": false,
        "isuid": false,
        "mimetype": "inode/x-empty",
        "mode": "0600",
        "mtime": 1657005500.596,
        "nlink": 1,
        "path": "/etc/crypttab",
        "pw_name": "root",
        "readable": true,
        "rgrp": false,
        "roth": false,
        "rusr": true,
        "size": 0,
        "uid": 0,
        "version": "10",
        "wgrp": false,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": false,
        "xoth": false,
        "xusr": false
    }
}

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 06 July 2022  11:01:44 +0000 (0:00:00.394)       0:00:37.388 ******** 

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 06 July 2022  11:01:44 +0000 (0:00:00.024)       0:00:37.412 ******** 
ok: [/cache/fedora-35.qcow2.snap]
META: role_complete for /cache/fedora-35.qcow2.snap

TASK [include_tasks] ***********************************************************
task path: /tmp/tmprua6lrek/tests/tests_remove_mount.yml:53
Wednesday 06 July 2022  11:01:45 +0000 (0:00:00.955)       0:00:38.367 ******** 
included: /tmp/tmprua6lrek/tests/verify-role-results.yml for /cache/fedora-35.qcow2.snap

TASK [Print out pool information] **********************************************
task path: /tmp/tmprua6lrek/tests/verify-role-results.yml:1
Wednesday 06 July 2022  11:01:45 +0000 (0:00:00.044)       0:00:38.412 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "_storage_pools_list": [
        {
            "disks": [
                "sda"
            ],
            "encryption": false,
            "encryption_cipher": null,
            "encryption_key": null,
            "encryption_key_size": null,
            "encryption_luks_version": null,
            "encryption_password": null,
            "name": "foo",
            "raid_chunk_size": null,
            "raid_device_count": null,
            "raid_level": null,
            "raid_metadata_version": null,
            "raid_spare_count": null,
            "state": "present",
            "type": "lvm",
            "volumes": [
                {
                    "_device": "/dev/mapper/foo-test1",
                    "_kernel_device": "/dev/dm-0",
                    "_mount_id": "/dev/mapper/foo-test1",
                    "_raw_device": "/dev/mapper/foo-test1",
                    "_raw_kernel_device": "/dev/dm-0",
                    "cache_devices": [],
                    "cache_mode": null,
                    "cache_size": 0,
                    "cached": false,
                    "compression": null,
                    "deduplication": null,
                    "disks": [
                        "sda"
                    ],
                    "encryption": false,
                    "encryption_cipher": null,
                    "encryption_key": null,
                    "encryption_key_size": null,
                    "encryption_luks_version": null,
                    "encryption_password": null,
                    "fs_create_options": "",
                    "fs_label": "",
                    "fs_overwrite_existing": true,
                    "fs_type": "xfs",
                    "mount_check": 0,
                    "mount_device_identifier": "uuid",
                    "mount_options": "defaults",
                    "mount_passno": 0,
                    "mount_point": "",
                    "name": "test1",
                    "raid_chunk_size": null,
                    "raid_device_count": null,
                    "raid_disks": [],
                    "raid_level": null,
                    "raid_metadata_version": null,
                    "raid_spare_count": null,
                    "size": "3g",
                    "state": "present",
                    "thin": false,
                    "thin_pool_name": null,
                    "thin_pool_size": null,
                    "type": "lvm",
                    "vdo_pool_size": null
                }
            ]
        }
    ]
}

TASK [Print out volume information] ********************************************
task path: /tmp/tmprua6lrek/tests/verify-role-results.yml:6
Wednesday 06 July 2022  11:01:45 +0000 (0:00:00.051)       0:00:38.464 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {}

TASK [Collect info about the volumes.] *****************************************
task path: /tmp/tmprua6lrek/tests/verify-role-results.yml:14
Wednesday 06 July 2022  11:01:45 +0000 (0:00:00.038)       0:00:38.502 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "info": {
        "/dev/mapper/foo-test1": {
            "fstype": "xfs",
            "label": "",
            "name": "/dev/mapper/foo-test1",
            "size": "3G",
            "type": "lvm",
            "uuid": "e363aca8-4773-4789-a338-0cadc3e349f5"
        },
        "/dev/nvme0n1": {
            "fstype": "",
            "label": "",
            "name": "/dev/nvme0n1",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/nvme1n1": {
            "fstype": "",
            "label": "",
            "name": "/dev/nvme1n1",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/nvme2n1": {
            "fstype": "",
            "label": "",
            "name": "/dev/nvme2n1",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/sda": {
            "fstype": "LVM2_member",
            "label": "",
            "name": "/dev/sda",
            "size": "10G",
            "type": "disk",
            "uuid": "jTr21u-XFvQ-FDKv-U00Y-g4iZ-7hhe-ySdWgB"
        },
        "/dev/sdb": {
            "fstype": "",
            "label": "",
            "name": "/dev/sdb",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/sdc": {
            "fstype": "",
            "label": "",
            "name": "/dev/sdc",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/sr0": {
            "fstype": "iso9660",
            "label": "cidata",
            "name": "/dev/sr0",
            "size": "364K",
            "type": "rom",
            "uuid": "2022-07-06-11-00-54-00"
        },
        "/dev/vda": {
            "fstype": "",
            "label": "",
            "name": "/dev/vda",
            "size": "4G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vda1": {
            "fstype": "",
            "label": "",
            "name": "/dev/vda1",
            "size": "1M",
            "type": "partition",
            "uuid": ""
        },
        "/dev/vda2": {
            "fstype": "ext4",
            "label": "boot",
            "name": "/dev/vda2",
            "size": "500M",
            "type": "partition",
            "uuid": "5f2f82d0-ae0a-4574-8811-62a31a51a870"
        },
        "/dev/vda3": {
            "fstype": "vfat",
            "label": "",
            "name": "/dev/vda3",
            "size": "100M",
            "type": "partition",
            "uuid": "5B84-6DD7"
        },
        "/dev/vda4": {
            "fstype": "",
            "label": "",
            "name": "/dev/vda4",
            "size": "4M",
            "type": "partition",
            "uuid": ""
        },
        "/dev/vda5": {
            "fstype": "btrfs",
            "label": "fedora",
            "name": "/dev/vda5",
            "size": "3.4G",
            "type": "partition",
            "uuid": "fbdaf05f-1a41-4dc5-b56e-a10edb430f9a"
        },
        "/dev/vdb": {
            "fstype": "",
            "label": "",
            "name": "/dev/vdb",
            "size": "2G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vdb1": {
            "fstype": "ext4",
            "label": "yumcache",
            "name": "/dev/vdb1",
            "size": "2G",
            "type": "partition",
            "uuid": "e676dfc5-3e4b-4331-8ede-73c3f56d2cab"
        },
        "/dev/vdc": {
            "fstype": "",
            "label": "",
            "name": "/dev/vdc",
            "size": "2G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vdc1": {
            "fstype": "ext4",
            "label": "yumvarlib",
            "name": "/dev/vdc1",
            "size": "2G",
            "type": "partition",
            "uuid": "0c299eb4-81f5-4414-b246-b95738eb82f0"
        },
        "/dev/vdd": {
            "fstype": "",
            "label": "",
            "name": "/dev/vdd",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vde": {
            "fstype": "",
            "label": "",
            "name": "/dev/vde",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vdf": {
            "fstype": "",
            "label": "",
            "name": "/dev/vdf",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/zram0": {
            "fstype": "",
            "label": "",
            "name": "/dev/zram0",
            "size": "1.9G",
            "type": "disk",
            "uuid": ""
        }
    }
}

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmprua6lrek/tests/verify-role-results.yml:19
Wednesday 06 July 2022  11:01:46 +0000 (0:00:00.450)       0:00:38.952 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "cmd": [
        "cat",
        "/etc/fstab"
    ],
    "delta": "0:00:00.002963",
    "end": "2022-07-06 11:01:46.063457",
    "rc": 0,
    "start": "2022-07-06 11:01:46.060494"
}

STDOUT:


#
# /etc/fstab
# Created by anaconda on Tue Jul  5 07:18:20 2022
#
# Accessible filesystems, by reference, are maintained under '/dev/disk/'.
# See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info.
#
# After editing this file, run 'systemctl daemon-reload' to update systemd
# units generated from this file.
#
UUID=fbdaf05f-1a41-4dc5-b56e-a10edb430f9a /                       btrfs   subvol=root,compress=zstd:1 0 0
UUID=5f2f82d0-ae0a-4574-8811-62a31a51a870 /boot                   ext4    defaults        1 2
UUID=5B84-6DD7          /boot/efi               vfat    defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
UUID=fbdaf05f-1a41-4dc5-b56e-a10edb430f9a /home                   btrfs   subvol=home,compress=zstd:1 0 0
/dev/vdb1	/var/cache/dnf	auto	defaults,nofail,comment=cloudconfig	0	2
/dev/vdc1	/var/lib/dnf	auto	defaults,nofail,comment=cloudconfig	0	2

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmprua6lrek/tests/verify-role-results.yml:24
Wednesday 06 July 2022  11:01:46 +0000 (0:00:00.392)       0:00:39.345 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "cmd": [
        "cat",
        "/etc/crypttab"
    ],
    "delta": "0:00:00.003662",
    "end": "2022-07-06 11:01:46.475776",
    "failed_when_result": false,
    "rc": 0,
    "start": "2022-07-06 11:01:46.472114"
}

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmprua6lrek/tests/verify-role-results.yml:33
Wednesday 06 July 2022  11:01:47 +0000 (0:00:00.416)       0:00:39.762 ******** 
[WARNING]: The loop variable 'storage_test_pool' is already in use. You should
set the `loop_var` value in the `loop_control` option for the task to something
else to avoid variable collisions and unexpected behavior.
included: /tmp/tmprua6lrek/tests/test-verify-pool.yml for /cache/fedora-35.qcow2.snap => (item={'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '', 'name': 'test1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/dm-0'}], 'raid_chunk_size': None})

TASK [Set _storage_pool_tests] *************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool.yml:5
Wednesday 06 July 2022  11:01:47 +0000 (0:00:00.061)       0:00:39.823 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_pool_tests": [
            "members",
            "volumes"
        ]
    },
    "changed": false
}

TASK [include_tasks] ***********************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool.yml:18
Wednesday 06 July 2022  11:01:47 +0000 (0:00:00.035)       0:00:39.859 ******** 
included: /tmp/tmprua6lrek/tests/test-verify-pool-members.yml for /cache/fedora-35.qcow2.snap => (item=members)
included: /tmp/tmprua6lrek/tests/test-verify-pool-volumes.yml for /cache/fedora-35.qcow2.snap => (item=volumes)

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool-members.yml:1
Wednesday 06 July 2022  11:01:47 +0000 (0:00:00.047)       0:00:39.907 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_expected_pv_count": "1",
        "_storage_test_pool_pvs_lvm": [
            "/dev/sda"
        ]
    },
    "changed": false
}

TASK [Get the canonical device path for each member device] ********************
task path: /tmp/tmprua6lrek/tests/test-verify-pool-members.yml:10
Wednesday 06 July 2022  11:01:47 +0000 (0:00:00.053)       0:00:39.960 ******** 
ok: [/cache/fedora-35.qcow2.snap] => (item=/dev/sda) => {
    "ansible_loop_var": "pv",
    "changed": false,
    "device": "/dev/sda",
    "pv": "/dev/sda"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool-members.yml:19
Wednesday 06 July 2022  11:01:47 +0000 (0:00:00.404)       0:00:40.364 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "__pvs_lvm_len": "1"
    },
    "changed": false
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool-members.yml:23
Wednesday 06 July 2022  11:01:47 +0000 (0:00:00.047)       0:00:40.412 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_pool_pvs": [
            "/dev/sda"
        ]
    },
    "changed": false
}

TASK [Verify PV count] *********************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool-members.yml:27
Wednesday 06 July 2022  11:01:47 +0000 (0:00:00.048)       0:00:40.461 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool-members.yml:34
Wednesday 06 July 2022  11:01:47 +0000 (0:00:00.050)       0:00:40.511 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_expected_pv_type": "disk"
    },
    "changed": false
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool-members.yml:38
Wednesday 06 July 2022  11:01:47 +0000 (0:00:00.039)       0:00:40.551 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_expected_pv_type": "disk"
    },
    "changed": false
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool-members.yml:42
Wednesday 06 July 2022  11:01:47 +0000 (0:00:00.051)       0:00:40.603 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check the type of each PV] ***********************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool-members.yml:46
Wednesday 06 July 2022  11:01:48 +0000 (0:00:00.026)       0:00:40.629 ******** 
ok: [/cache/fedora-35.qcow2.snap] => (item=/dev/sda) => {
    "ansible_loop_var": "pv",
    "changed": false,
    "pv": "/dev/sda"
}

MSG:

All assertions passed

TASK [Check MD RAID] ***********************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool-members.yml:56
Wednesday 06 July 2022  11:01:48 +0000 (0:00:00.042)       0:00:40.671 ******** 
included: /tmp/tmprua6lrek/tests/verify-pool-md.yml for /cache/fedora-35.qcow2.snap

TASK [get information about RAID] **********************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-md.yml:6
Wednesday 06 July 2022  11:01:48 +0000 (0:00:00.047)       0:00:40.719 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-md.yml:12
Wednesday 06 July 2022  11:01:48 +0000 (0:00:00.026)       0:00:40.746 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-md.yml:16
Wednesday 06 July 2022  11:01:48 +0000 (0:00:00.025)       0:00:40.771 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-md.yml:20
Wednesday 06 July 2022  11:01:48 +0000 (0:00:00.024)       0:00:40.796 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-md.yml:24
Wednesday 06 July 2022  11:01:48 +0000 (0:00:00.024)       0:00:40.820 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-md.yml:30
Wednesday 06 July 2022  11:01:48 +0000 (0:00:00.054)       0:00:40.875 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-md.yml:36
Wednesday 06 July 2022  11:01:48 +0000 (0:00:00.024)       0:00:40.899 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-md.yml:44
Wednesday 06 July 2022  11:01:48 +0000 (0:00:00.023)       0:00:40.922 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_md_active_devices_re": null,
        "storage_test_md_metadata_version_re": null,
        "storage_test_md_spare_devices_re": null
    },
    "changed": false
}

TASK [Check LVM RAID] **********************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool-members.yml:59
Wednesday 06 July 2022  11:01:48 +0000 (0:00:00.034)       0:00:40.957 ******** 
included: /tmp/tmprua6lrek/tests/verify-pool-members-lvmraid.yml for /cache/fedora-35.qcow2.snap

TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/tmprua6lrek/tests/verify-pool-members-lvmraid.yml:1
Wednesday 06 July 2022  11:01:48 +0000 (0:00:00.043)       0:00:41.000 ******** 
included: /tmp/tmprua6lrek/tests/verify-pool-member-lvmraid.yml for /cache/fedora-35.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '', 'name': 'test1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/dm-0'})

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-lvmraid.yml:3
Wednesday 06 July 2022  11:01:48 +0000 (0:00:00.042)       0:00:41.043 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-lvmraid.yml:8
Wednesday 06 July 2022  11:01:48 +0000 (0:00:00.029)       0:00:41.072 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-lvmraid.yml:12
Wednesday 06 July 2022  11:01:48 +0000 (0:00:00.027)       0:00:41.100 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check Thin Pools] ********************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool-members.yml:62
Wednesday 06 July 2022  11:01:48 +0000 (0:00:00.029)       0:00:41.129 ******** 
included: /tmp/tmprua6lrek/tests/verify-pool-members-thin.yml for /cache/fedora-35.qcow2.snap

TASK [Validate pool member thinpool settings] **********************************
task path: /tmp/tmprua6lrek/tests/verify-pool-members-thin.yml:1
Wednesday 06 July 2022  11:01:48 +0000 (0:00:00.046)       0:00:41.176 ******** 
included: /tmp/tmprua6lrek/tests/verify-pool-member-thin.yml for /cache/fedora-35.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '', 'name': 'test1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/dm-0'})

TASK [Get information about thinpool] ******************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-thin.yml:3
Wednesday 06 July 2022  11:01:48 +0000 (0:00:00.045)       0:00:41.221 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check that volume is in correct thinpool (when thinp name is provided)] ***
task path: /tmp/tmprua6lrek/tests/verify-pool-member-thin.yml:8
Wednesday 06 July 2022  11:01:48 +0000 (0:00:00.025)       0:00:41.247 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check that volume is in thinpool (when thinp name is not provided)] ******
task path: /tmp/tmprua6lrek/tests/verify-pool-member-thin.yml:13
Wednesday 06 July 2022  11:01:48 +0000 (0:00:00.025)       0:00:41.272 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-thin.yml:17
Wednesday 06 July 2022  11:01:48 +0000 (0:00:00.026)       0:00:41.299 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check member encryption] *************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool-members.yml:65
Wednesday 06 July 2022  11:01:48 +0000 (0:00:00.024)       0:00:41.324 ******** 
included: /tmp/tmprua6lrek/tests/verify-pool-members-encryption.yml for /cache/fedora-35.qcow2.snap

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-members-encryption.yml:4
Wednesday 06 July 2022  11:01:48 +0000 (0:00:00.049)       0:00:41.373 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_expected_crypttab_entries": "0",
        "_storage_test_expected_crypttab_key_file": "-"
    },
    "changed": false
}

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-members-encryption.yml:8
Wednesday 06 July 2022  11:01:48 +0000 (0:00:00.050)       0:00:41.423 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => (item=/dev/sda)  => {
    "_storage_test_pool_member_path": "/dev/sda",
    "ansible_loop_var": "_storage_test_pool_member_path",
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/tmprua6lrek/tests/verify-pool-members-encryption.yml:15
Wednesday 06 July 2022  11:01:48 +0000 (0:00:00.029)       0:00:41.453 ******** 
included: /tmp/tmprua6lrek/tests/verify-pool-member-crypttab.yml for /cache/fedora-35.qcow2.snap => (item=/dev/sda)

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-crypttab.yml:1
Wednesday 06 July 2022  11:01:48 +0000 (0:00:00.051)       0:00:41.504 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": []
    },
    "changed": false
}

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-crypttab.yml:6
Wednesday 06 July 2022  11:01:48 +0000 (0:00:00.050)       0:00:41.554 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-crypttab.yml:11
Wednesday 06 July 2022  11:01:48 +0000 (0:00:00.060)       0:00:41.614 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-crypttab.yml:17
Wednesday 06 July 2022  11:01:49 +0000 (0:00:00.036)       0:00:41.651 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-crypttab.yml:23
Wednesday 06 July 2022  11:01:49 +0000 (0:00:00.035)       0:00:41.686 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-crypttab.yml:29
Wednesday 06 July 2022  11:01:49 +0000 (0:00:00.036)       0:00:41.723 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": null
    },
    "changed": false
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-members-encryption.yml:22
Wednesday 06 July 2022  11:01:49 +0000 (0:00:00.039)       0:00:41.763 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": null,
        "_storage_test_crypttab_key_file": null
    },
    "changed": false
}

TASK [Check VDO] ***************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool-members.yml:68
Wednesday 06 July 2022  11:01:49 +0000 (0:00:00.035)       0:00:41.798 ******** 
included: /tmp/tmprua6lrek/tests/verify-pool-members-vdo.yml for /cache/fedora-35.qcow2.snap

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-members-vdo.yml:1
Wednesday 06 July 2022  11:01:49 +0000 (0:00:00.050)       0:00:41.849 ******** 
included: /tmp/tmprua6lrek/tests/verify-pool-member-vdo.yml for /cache/fedora-35.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '', 'name': 'test1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/dm-0'})

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-vdo.yml:3
Wednesday 06 July 2022  11:01:49 +0000 (0:00:00.045)       0:00:41.894 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-vdo.yml:8
Wednesday 06 July 2022  11:01:49 +0000 (0:00:00.025)       0:00:41.919 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [assert] ******************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-vdo.yml:11
Wednesday 06 July 2022  11:01:49 +0000 (0:00:00.026)       0:00:41.946 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [assert] ******************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-vdo.yml:16
Wednesday 06 July 2022  11:01:49 +0000 (0:00:00.026)       0:00:41.972 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-vdo.yml:21
Wednesday 06 July 2022  11:01:49 +0000 (0:00:00.025)       0:00:41.998 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [assert] ******************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-vdo.yml:24
Wednesday 06 July 2022  11:01:49 +0000 (0:00:00.024)       0:00:42.023 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [assert] ******************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-vdo.yml:29
Wednesday 06 July 2022  11:01:49 +0000 (0:00:00.028)       0:00:42.051 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-vdo.yml:39
Wednesday 06 July 2022  11:01:49 +0000 (0:00:00.058)       0:00:42.109 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_vdo_status": null
    },
    "changed": false
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool-members.yml:71
Wednesday 06 July 2022  11:01:49 +0000 (0:00:00.034)       0:00:42.144 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "__pvs_lvm_len": null,
        "_storage_test_expected_pv_count": null,
        "_storage_test_expected_pv_type": null,
        "_storage_test_pool_pvs": [],
        "_storage_test_pool_pvs_lvm": []
    },
    "changed": false
}

TASK [verify the volumes] ******************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool-volumes.yml:3
Wednesday 06 July 2022  11:01:49 +0000 (0:00:00.034)       0:00:42.179 ******** 
[WARNING]: The loop variable 'storage_test_volume' is already in use. You
should set the `loop_var` value in the `loop_control` option for the task to
something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmprua6lrek/tests/test-verify-volume.yml for /cache/fedora-35.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '', 'name': 'test1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/dm-0'})

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume.yml:2
Wednesday 06 July 2022  11:01:49 +0000 (0:00:00.044)       0:00:42.223 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_volume_present": true,
        "_storage_volume_tests": [
            "mount",
            "fstab",
            "fs",
            "device",
            "encryption",
            "md",
            "size",
            "cache"
        ]
    },
    "changed": false
}

TASK [include_tasks] ***********************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume.yml:10
Wednesday 06 July 2022  11:01:49 +0000 (0:00:00.053)       0:00:42.276 ******** 
included: /tmp/tmprua6lrek/tests/test-verify-volume-mount.yml for /cache/fedora-35.qcow2.snap => (item=mount)
included: /tmp/tmprua6lrek/tests/test-verify-volume-fstab.yml for /cache/fedora-35.qcow2.snap => (item=fstab)
included: /tmp/tmprua6lrek/tests/test-verify-volume-fs.yml for /cache/fedora-35.qcow2.snap => (item=fs)
included: /tmp/tmprua6lrek/tests/test-verify-volume-device.yml for /cache/fedora-35.qcow2.snap => (item=device)
included: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml for /cache/fedora-35.qcow2.snap => (item=encryption)
included: /tmp/tmprua6lrek/tests/test-verify-volume-md.yml for /cache/fedora-35.qcow2.snap => (item=md)
included: /tmp/tmprua6lrek/tests/test-verify-volume-size.yml for /cache/fedora-35.qcow2.snap => (item=size)
included: /tmp/tmprua6lrek/tests/test-verify-volume-cache.yml for /cache/fedora-35.qcow2.snap => (item=cache)

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-mount.yml:6
Wednesday 06 July 2022  11:01:49 +0000 (0:00:00.086)       0:00:42.363 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_device_path": "/dev/mapper/foo-test1"
    },
    "changed": false
}

TASK [Set some facts] **********************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-mount.yml:14
Wednesday 06 July 2022  11:01:49 +0000 (0:00:00.045)       0:00:42.408 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_mount_device_matches": [],
        "storage_test_mount_expected_match_count": "0",
        "storage_test_mount_point_matches": [],
        "storage_test_swap_expected_matches": "0"
    },
    "changed": false
}

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-mount.yml:28
Wednesday 06 July 2022  11:01:49 +0000 (0:00:00.059)       0:00:42.468 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-mount.yml:37
Wednesday 06 July 2022  11:01:49 +0000 (0:00:00.027)       0:00:42.496 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-mount.yml:45
Wednesday 06 July 2022  11:01:49 +0000 (0:00:00.049)       0:00:42.545 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [command] *****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-mount.yml:54
Wednesday 06 July 2022  11:01:49 +0000 (0:00:00.040)       0:00:42.585 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Gather swap info] ********************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-mount.yml:58
Wednesday 06 July 2022  11:01:49 +0000 (0:00:00.024)       0:00:42.610 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify swap status] ******************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-mount.yml:63
Wednesday 06 July 2022  11:01:50 +0000 (0:00:00.023)       0:00:42.634 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Unset facts] *************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-mount.yml:75
Wednesday 06 July 2022  11:01:50 +0000 (0:00:00.025)       0:00:42.659 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_mount_device_matches": null,
        "storage_test_mount_expected_match_count": null,
        "storage_test_mount_point_matches": null,
        "storage_test_swap_expected_matches": null,
        "storage_test_swaps": null,
        "storage_test_sys_node": null
    },
    "changed": false
}

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-fstab.yml:2
Wednesday 06 July 2022  11:01:50 +0000 (0:00:00.040)       0:00:42.700 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_fstab_expected_id_matches": "0",
        "storage_test_fstab_expected_mount_options_matches": "1",
        "storage_test_fstab_expected_mount_point_matches": "0",
        "storage_test_fstab_id_matches": [],
        "storage_test_fstab_mount_options_matches": [],
        "storage_test_fstab_mount_point_matches": []
    },
    "changed": false
}

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-fstab.yml:25
Wednesday 06 July 2022  11:01:50 +0000 (0:00:00.066)       0:00:42.766 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-fstab.yml:32
Wednesday 06 July 2022  11:01:50 +0000 (0:00:00.051)       0:00:42.818 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-fstab.yml:39
Wednesday 06 July 2022  11:01:50 +0000 (0:00:00.053)       0:00:42.871 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Clean up variables] ******************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-fstab.yml:49
Wednesday 06 July 2022  11:01:50 +0000 (0:00:00.035)       0:00:42.907 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_fstab_expected_id_matches": null,
        "storage_test_fstab_expected_mount_options_matches": null,
        "storage_test_fstab_expected_mount_point_matches": null,
        "storage_test_fstab_id_matches": null,
        "storage_test_fstab_mount_options_matches": null,
        "storage_test_fstab_mount_point_matches": null
    },
    "changed": false
}

TASK [Verify fs type] **********************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-fs.yml:4
Wednesday 06 July 2022  11:01:50 +0000 (0:00:00.034)       0:00:42.942 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-fs.yml:10
Wednesday 06 July 2022  11:01:50 +0000 (0:00:00.042)       0:00:42.984 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-device.yml:4
Wednesday 06 July 2022  11:01:50 +0000 (0:00:00.042)       0:00:43.027 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "stat": {
        "atime": 1657105285.4657035,
        "attr_flags": "",
        "attributes": [],
        "block_size": 4096,
        "blocks": 0,
        "charset": "binary",
        "ctime": 1657105283.3717036,
        "dev": 5,
        "device_type": 64768,
        "executable": false,
        "exists": true,
        "gid": 6,
        "gr_name": "disk",
        "inode": 505,
        "isblk": true,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": false,
        "issock": false,
        "isuid": false,
        "mimetype": "inode/symlink",
        "mode": "0660",
        "mtime": 1657105283.3717036,
        "nlink": 1,
        "path": "/dev/mapper/foo-test1",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": false,
        "rusr": true,
        "size": 0,
        "uid": 0,
        "version": null,
        "wgrp": true,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": false,
        "xoth": false,
        "xusr": false
    }
}

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-device.yml:10
Wednesday 06 July 2022  11:01:50 +0000 (0:00:00.409)       0:00:43.437 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-device.yml:18
Wednesday 06 July 2022  11:01:50 +0000 (0:00:00.040)       0:00:43.478 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-device.yml:24
Wednesday 06 July 2022  11:01:50 +0000 (0:00:00.042)       0:00:43.520 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "st_volume_type": "lvm"
    },
    "changed": false
}

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-device.yml:28
Wednesday 06 July 2022  11:01:50 +0000 (0:00:00.037)       0:00:43.558 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-device.yml:33
Wednesday 06 July 2022  11:01:50 +0000 (0:00:00.025)       0:00:43.583 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:3
Wednesday 06 July 2022  11:01:51 +0000 (0:00:00.041)       0:00:43.625 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Ensure cryptsetup is present] ********************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:10
Wednesday 06 July 2022  11:01:51 +0000 (0:00:00.026)       0:00:43.651 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:15
Wednesday 06 July 2022  11:01:53 +0000 (0:00:02.007)       0:00:45.658 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:21
Wednesday 06 July 2022  11:01:53 +0000 (0:00:00.026)       0:00:45.685 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:30
Wednesday 06 July 2022  11:01:53 +0000 (0:00:00.023)       0:00:45.708 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:38
Wednesday 06 July 2022  11:01:53 +0000 (0:00:00.054)       0:00:45.762 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:44
Wednesday 06 July 2022  11:01:53 +0000 (0:00:00.024)       0:00:45.786 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:49
Wednesday 06 July 2022  11:01:53 +0000 (0:00:00.024)       0:00:45.811 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:55
Wednesday 06 July 2022  11:01:53 +0000 (0:00:00.027)       0:00:45.839 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:61
Wednesday 06 July 2022  11:01:53 +0000 (0:00:00.024)       0:00:45.864 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:67
Wednesday 06 July 2022  11:01:53 +0000 (0:00:00.023)       0:00:45.888 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": [],
        "_storage_test_expected_crypttab_entries": "0",
        "_storage_test_expected_crypttab_key_file": "-"
    },
    "changed": false
}

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:74
Wednesday 06 July 2022  11:01:53 +0000 (0:00:00.049)       0:00:45.937 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:79
Wednesday 06 July 2022  11:01:53 +0000 (0:00:00.049)       0:00:45.986 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:85
Wednesday 06 July 2022  11:01:53 +0000 (0:00:00.043)       0:00:46.030 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:91
Wednesday 06 July 2022  11:01:53 +0000 (0:00:00.037)       0:00:46.068 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:97
Wednesday 06 July 2022  11:01:53 +0000 (0:00:00.036)       0:00:46.105 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": null,
        "_storage_test_expected_crypttab_entries": null,
        "_storage_test_expected_crypttab_key_file": null
    },
    "changed": false
}

TASK [get information about RAID] **********************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-md.yml:7
Wednesday 06 July 2022  11:01:53 +0000 (0:00:00.038)       0:00:46.144 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-md.yml:13
Wednesday 06 July 2022  11:01:53 +0000 (0:00:00.039)       0:00:46.183 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-md.yml:17
Wednesday 06 July 2022  11:01:53 +0000 (0:00:00.038)       0:00:46.221 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-md.yml:21
Wednesday 06 July 2022  11:01:53 +0000 (0:00:00.036)       0:00:46.258 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-md.yml:25
Wednesday 06 July 2022  11:01:53 +0000 (0:00:00.041)       0:00:46.299 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-md.yml:31
Wednesday 06 July 2022  11:01:53 +0000 (0:00:00.038)       0:00:46.338 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-md.yml:37
Wednesday 06 July 2022  11:01:53 +0000 (0:00:00.037)       0:00:46.375 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-size.yml:3
Wednesday 06 July 2022  11:01:53 +0000 (0:00:00.046)       0:00:46.422 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "bytes": 3221225472,
    "changed": false,
    "lvm": "3g",
    "parted": "3GiB",
    "size": "3 GiB"
}

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-size.yml:9
Wednesday 06 July 2022  11:01:54 +0000 (0:00:00.410)       0:00:46.832 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "bytes": 3221225472,
    "changed": false,
    "lvm": "3g",
    "parted": "3GiB",
    "size": "3 GiB"
}

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-size.yml:15
Wednesday 06 July 2022  11:01:54 +0000 (0:00:00.413)       0:00:47.245 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_expected_size": "3221225472"
    },
    "changed": false
}

TASK [debug] *******************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-size.yml:20
Wednesday 06 July 2022  11:01:54 +0000 (0:00:00.057)       0:00:47.303 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "storage_test_expected_size": "3221225472"
}

TASK [debug] *******************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-size.yml:25
Wednesday 06 July 2022  11:01:54 +0000 (0:00:00.040)       0:00:47.344 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-size.yml:28
Wednesday 06 July 2022  11:01:54 +0000 (0:00:00.040)       0:00:47.384 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-size.yml:31
Wednesday 06 July 2022  11:01:54 +0000 (0:00:00.045)       0:00:47.430 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [debug] *******************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-size.yml:36
Wednesday 06 July 2022  11:01:54 +0000 (0:00:00.041)       0:00:47.471 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmprua6lrek/tests/test-verify-volume-size.yml:39
Wednesday 06 July 2022  11:01:54 +0000 (0:00:00.040)       0:00:47.511 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [debug] *******************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-size.yml:44
Wednesday 06 July 2022  11:01:54 +0000 (0:00:00.040)       0:00:47.551 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "storage_test_actual_size": {
        "bytes": 3221225472,
        "changed": false,
        "failed": false,
        "lvm": "3g",
        "parted": "3GiB",
        "size": "3 GiB"
    }
}

TASK [debug] *******************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-size.yml:47
Wednesday 06 July 2022  11:01:54 +0000 (0:00:00.041)       0:00:47.593 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "storage_test_expected_size": "3221225472"
}

TASK [assert] ******************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-size.yml:50
Wednesday 06 July 2022  11:01:55 +0000 (0:00:00.036)       0:00:47.630 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed
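
The assertion above compares the parsed actual size with the expected size established earlier (both 3221225472 bytes here). A minimal sketch of that comparison, assuming an exact match is required (the literal test may allow a small tolerance):

    - name: assert (size check sketch)
      assert:
        that:
          - storage_test_actual_size.bytes == storage_test_expected_size | int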

TASK [Get information about the LV] ********************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-cache.yml:6
Wednesday 06 July 2022  11:01:55 +0000 (0:00:00.097)       0:00:47.728 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "cmd": [
        "lvs",
        "--noheadings",
        "--nameprefixes",
        "--units=b",
        "--nosuffix",
        "--unquoted",
        "-o",
        "name,attr,cache_total_blocks,chunk_size,segtype",
        "foo/test1"
    ],
    "delta": "0:00:00.058803",
    "end": "2022-07-06 11:01:55.012174",
    "rc": 0,
    "start": "2022-07-06 11:01:54.953371"
}

STDOUT:

  LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-a----- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-cache.yml:14
Wednesday 06 July 2022  11:01:55 +0000 (0:00:00.620)       0:00:48.348 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_lv_segtype": [
            "linear"
        ]
    },
    "changed": false
}

TASK [check segment type] ******************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-cache.yml:17
Wednesday 06 July 2022  11:01:55 +0000 (0:00:00.083)       0:00:48.431 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed
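
The segment-type check above is driven by the LVM2_SEGTYPE field in the name-prefixed lvs output shown earlier ("linear" for this uncached LV). A minimal sketch of that parse-and-check step, assuming the lvs result is registered as storage_test_lvs (the register name is an assumption):

    - name: set_fact (extract segment type, sketch)
      set_fact:
        storage_test_lv_segtype: "{{ storage_test_lvs.stdout.split() | select('match', '^LVM2_SEGTYPE=') | map('regex_replace', '^LVM2_SEGTYPE=', '') | list }}"

    - name: check segment type (sketch)
      assert:
        that:
          - storage_test_lv_segtype[0] == 'linear'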

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-cache.yml:22
Wednesday 06 July 2022  11:01:55 +0000 (0:00:00.060)       0:00:48.492 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-cache.yml:26
Wednesday 06 July 2022  11:01:55 +0000 (0:00:00.040)       0:00:48.533 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-cache.yml:32
Wednesday 06 July 2022  11:01:55 +0000 (0:00:00.040)       0:00:48.573 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check cache size] ********************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-cache.yml:36
Wednesday 06 July 2022  11:01:55 +0000 (0:00:00.040)       0:00:48.614 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Clean up facts] **********************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume.yml:16
Wednesday 06 July 2022  11:01:56 +0000 (0:00:00.041)       0:00:48.655 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_volume_present": null
    },
    "changed": false
}

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmprua6lrek/tests/verify-role-results.yml:40
Wednesday 06 July 2022  11:01:56 +0000 (0:00:00.036)       0:00:48.691 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_pool": null
    },
    "changed": false
}

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmprua6lrek/tests/verify-role-results.yml:47
Wednesday 06 July 2022  11:01:56 +0000 (0:00:00.037)       0:00:48.729 ******** 

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmprua6lrek/tests/verify-role-results.yml:57
Wednesday 06 July 2022  11:01:56 +0000 (0:00:00.023)       0:00:48.752 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_blkinfo": null,
        "storage_test_crypttab": null,
        "storage_test_fstab": null,
        "storage_test_volume": null
    },
    "changed": false
}

TASK [Repeat the previous invocation to verify idempotence] ********************
task path: /tmp/tmprua6lrek/tests/tests_remove_mount.yml:55
Wednesday 06 July 2022  11:01:56 +0000 (0:00:00.037)       0:00:48.790 ******** 

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 06 July 2022  11:01:56 +0000 (0:00:00.050)       0:00:48.840 ******** 
included: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/fedora-35.qcow2.snap

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 06 July 2022  11:01:56 +0000 (0:00:00.037)       0:00:48.878 ******** 
ok: [/cache/fedora-35.qcow2.snap]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 06 July 2022  11:01:56 +0000 (0:00:00.547)       0:00:49.426 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => (item=RedHat.yml)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
ok: [/cache/fedora-35.qcow2.snap] => (item=Fedora.yml) => {
    "ansible_facts": {
        "_storage_copr_packages": [
            {
                "packages": [
                    "vdo",
                    "kmod-vdo"
                ],
                "repository": "rhawalsh/dm-vdo"
            }
        ],
        "_storage_copr_support_packages": [
            "dnf-plugins-core"
        ],
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/vars/Fedora.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "Fedora.yml"
}
skipping: [/cache/fedora-35.qcow2.snap] => (item=Fedora_35.yml)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "Fedora_35.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [/cache/fedora-35.qcow2.snap] => (item=Fedora_35.yml)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "Fedora_35.yml",
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 06 July 2022  11:01:56 +0000 (0:00:00.064)       0:00:49.490 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 06 July 2022  11:01:56 +0000 (0:00:00.036)       0:00:49.526 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 06 July 2022  11:01:56 +0000 (0:00:00.034)       0:00:49.560 ******** 
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
included: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/fedora-35.qcow2.snap

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 06 July 2022  11:01:56 +0000 (0:00:00.044)       0:00:49.605 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 06 July 2022  11:01:57 +0000 (0:00:00.020)       0:00:49.625 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 06 July 2022  11:01:57 +0000 (0:00:00.029)       0:00:49.654 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "storage_pools": [
        {
            "disks": [
                "sda"
            ],
            "name": "foo",
            "volumes": [
                {
                    "mount_point": "",
                    "name": "test1",
                    "size": "3g"
                }
            ]
        }
    ]
}
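
For reference, the storage_pools value printed above corresponds to an invocation of the role with a single LVM pool "foo" on sda and one 3 GiB volume "test1" whose mount_point is empty (i.e. the mount is removed). A minimal sketch of such a play, with the disk list hard-coded for illustration:

    - name: Invoke the storage role (sketch of the repeated invocation)
      include_role:
        name: linux-system-roles.storage
      vars:
        storage_pools:
          - name: foo
            disks: ['sda']
            volumes:
              - name: test1
                size: 3g
                mount_point: ""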

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 06 July 2022  11:01:57 +0000 (0:00:00.038)       0:00:49.693 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined"
}

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 06 July 2022  11:01:57 +0000 (0:00:00.039)       0:00:49.733 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 06 July 2022  11:01:57 +0000 (0:00:00.031)       0:00:49.764 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 06 July 2022  11:01:57 +0000 (0:00:00.029)       0:00:49.794 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 06 July 2022  11:01:57 +0000 (0:00:00.029)       0:00:49.824 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 06 July 2022  11:01:57 +0000 (0:00:00.029)       0:00:49.854 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_cryptsetup_services": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 06 July 2022  11:01:57 +0000 (0:00:00.058)       0:00:49.913 ******** 

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 06 July 2022  11:01:57 +0000 (0:00:00.022)       0:00:49.936 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [
        "/dev/sr0",
        "/dev/vda1",
        "/dev/vda2",
        "/dev/vda3",
        "/dev/vda4",
        "/dev/vda5",
        "/dev/mapper/foo-test1",
        "/dev/sdb",
        "/dev/sdc",
        "/dev/nvme0n1",
        "/dev/nvme1n1",
        "/dev/nvme2n1",
        "/dev/vdb1",
        "/dev/vdc1",
        "/dev/vdd",
        "/dev/vde",
        "/dev/vdf",
        "/dev/zram0"
    ],
    "mounts": [],
    "packages": [
        "lvm2",
        "e2fsprogs",
        "btrfs-progs",
        "dosfstools",
        "xfsprogs"
    ],
    "pools": [
        {
            "disks": [
                "sda"
            ],
            "encryption": false,
            "encryption_cipher": null,
            "encryption_key": null,
            "encryption_key_size": null,
            "encryption_luks_version": null,
            "encryption_password": null,
            "name": "foo",
            "raid_chunk_size": null,
            "raid_device_count": null,
            "raid_level": null,
            "raid_metadata_version": null,
            "raid_spare_count": null,
            "state": "present",
            "type": "lvm",
            "volumes": [
                {
                    "_device": "/dev/mapper/foo-test1",
                    "_kernel_device": "/dev/dm-0",
                    "_mount_id": "/dev/mapper/foo-test1",
                    "_raw_device": "/dev/mapper/foo-test1",
                    "_raw_kernel_device": "/dev/dm-0",
                    "cache_devices": [],
                    "cache_mode": null,
                    "cache_size": 0,
                    "cached": false,
                    "compression": null,
                    "deduplication": null,
                    "disks": [
                        "sda"
                    ],
                    "encryption": false,
                    "encryption_cipher": null,
                    "encryption_key": null,
                    "encryption_key_size": null,
                    "encryption_luks_version": null,
                    "encryption_password": null,
                    "fs_create_options": "",
                    "fs_label": "",
                    "fs_overwrite_existing": true,
                    "fs_type": "xfs",
                    "mount_check": 0,
                    "mount_device_identifier": "uuid",
                    "mount_options": "defaults",
                    "mount_passno": 0,
                    "mount_point": "",
                    "name": "test1",
                    "raid_chunk_size": null,
                    "raid_device_count": null,
                    "raid_disks": [],
                    "raid_level": null,
                    "raid_metadata_version": null,
                    "raid_spare_count": null,
                    "size": "3g",
                    "state": "present",
                    "thin": false,
                    "thin_pool_name": null,
                    "thin_pool_size": null,
                    "type": "lvm",
                    "vdo_pool_size": null
                }
            ]
        }
    ],
    "volumes": []
}

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 06 July 2022  11:01:59 +0000 (0:00:02.037)       0:00:51.973 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 06 July 2022  11:01:59 +0000 (0:00:00.039)       0:00:52.012 ******** 

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 06 July 2022  11:01:59 +0000 (0:00:00.023)       0:00:52.036 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "blivet_output": {
        "actions": [],
        "changed": false,
        "crypts": [],
        "failed": false,
        "leaves": [
            "/dev/sr0",
            "/dev/vda1",
            "/dev/vda2",
            "/dev/vda3",
            "/dev/vda4",
            "/dev/vda5",
            "/dev/mapper/foo-test1",
            "/dev/sdb",
            "/dev/sdc",
            "/dev/nvme0n1",
            "/dev/nvme1n1",
            "/dev/nvme2n1",
            "/dev/vdb1",
            "/dev/vdc1",
            "/dev/vdd",
            "/dev/vde",
            "/dev/vdf",
            "/dev/zram0"
        ],
        "mounts": [],
        "packages": [
            "lvm2",
            "e2fsprogs",
            "btrfs-progs",
            "dosfstools",
            "xfsprogs"
        ],
        "pools": [
            {
                "disks": [
                    "sda"
                ],
                "encryption": false,
                "encryption_cipher": null,
                "encryption_key": null,
                "encryption_key_size": null,
                "encryption_luks_version": null,
                "encryption_password": null,
                "name": "foo",
                "raid_chunk_size": null,
                "raid_device_count": null,
                "raid_level": null,
                "raid_metadata_version": null,
                "raid_spare_count": null,
                "state": "present",
                "type": "lvm",
                "volumes": [
                    {
                        "_device": "/dev/mapper/foo-test1",
                        "_kernel_device": "/dev/dm-0",
                        "_mount_id": "/dev/mapper/foo-test1",
                        "_raw_device": "/dev/mapper/foo-test1",
                        "_raw_kernel_device": "/dev/dm-0",
                        "cache_devices": [],
                        "cache_mode": null,
                        "cache_size": 0,
                        "cached": false,
                        "compression": null,
                        "deduplication": null,
                        "disks": [
                            "sda"
                        ],
                        "encryption": false,
                        "encryption_cipher": null,
                        "encryption_key": null,
                        "encryption_key_size": null,
                        "encryption_luks_version": null,
                        "encryption_password": null,
                        "fs_create_options": "",
                        "fs_label": "",
                        "fs_overwrite_existing": true,
                        "fs_type": "xfs",
                        "mount_check": 0,
                        "mount_device_identifier": "uuid",
                        "mount_options": "defaults",
                        "mount_passno": 0,
                        "mount_point": "",
                        "name": "test1",
                        "raid_chunk_size": null,
                        "raid_device_count": null,
                        "raid_disks": [],
                        "raid_level": null,
                        "raid_metadata_version": null,
                        "raid_spare_count": null,
                        "size": "3g",
                        "state": "present",
                        "thin": false,
                        "thin_pool_name": null,
                        "thin_pool_size": null,
                        "type": "lvm",
                        "vdo_pool_size": null
                    }
                ]
            }
        ],
        "volumes": []
    }
}
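
Because this is the repeated invocation, blivet reports no actions and changed: false, which is exactly what an idempotence check would look for. A minimal sketch of such a check against the registered blivet_output shown above (the literal test may verify idempotence elsewhere):

    - name: Assert that the repeated run changed nothing (sketch)
      assert:
        that:
          - not blivet_output.changed
          - blivet_output.actions | length == 0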

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 06 July 2022  11:01:59 +0000 (0:00:00.041)       0:00:52.077 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_pools_list": [
            {
                "disks": [
                    "sda"
                ],
                "encryption": false,
                "encryption_cipher": null,
                "encryption_key": null,
                "encryption_key_size": null,
                "encryption_luks_version": null,
                "encryption_password": null,
                "name": "foo",
                "raid_chunk_size": null,
                "raid_device_count": null,
                "raid_level": null,
                "raid_metadata_version": null,
                "raid_spare_count": null,
                "state": "present",
                "type": "lvm",
                "volumes": [
                    {
                        "_device": "/dev/mapper/foo-test1",
                        "_kernel_device": "/dev/dm-0",
                        "_mount_id": "/dev/mapper/foo-test1",
                        "_raw_device": "/dev/mapper/foo-test1",
                        "_raw_kernel_device": "/dev/dm-0",
                        "cache_devices": [],
                        "cache_mode": null,
                        "cache_size": 0,
                        "cached": false,
                        "compression": null,
                        "deduplication": null,
                        "disks": [
                            "sda"
                        ],
                        "encryption": false,
                        "encryption_cipher": null,
                        "encryption_key": null,
                        "encryption_key_size": null,
                        "encryption_luks_version": null,
                        "encryption_password": null,
                        "fs_create_options": "",
                        "fs_label": "",
                        "fs_overwrite_existing": true,
                        "fs_type": "xfs",
                        "mount_check": 0,
                        "mount_device_identifier": "uuid",
                        "mount_options": "defaults",
                        "mount_passno": 0,
                        "mount_point": "",
                        "name": "test1",
                        "raid_chunk_size": null,
                        "raid_device_count": null,
                        "raid_disks": [],
                        "raid_level": null,
                        "raid_metadata_version": null,
                        "raid_spare_count": null,
                        "size": "3g",
                        "state": "present",
                        "thin": false,
                        "thin_pool_name": null,
                        "thin_pool_size": null,
                        "type": "lvm",
                        "vdo_pool_size": null
                    }
                ]
            }
        ]
    },
    "changed": false
}

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 06 July 2022  11:01:59 +0000 (0:00:00.040)       0:00:52.118 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 06 July 2022  11:01:59 +0000 (0:00:00.034)       0:00:52.152 ******** 

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 06 July 2022  11:01:59 +0000 (0:00:00.037)       0:00:52.190 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 06 July 2022  11:01:59 +0000 (0:00:00.024)       0:00:52.214 ******** 

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 06 July 2022  11:01:59 +0000 (0:00:00.041)       0:00:52.256 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 06 July 2022  11:01:59 +0000 (0:00:00.024)       0:00:52.280 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "stat": {
        "atime": 1657093385.4860332,
        "attr_flags": "",
        "attributes": [],
        "block_size": 4096,
        "blocks": 0,
        "charset": "binary",
        "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709",
        "ctime": 1657005647.423,
        "dev": 31,
        "device_type": 0,
        "executable": false,
        "exists": true,
        "gid": 0,
        "gr_name": "root",
        "inode": 267,
        "isblk": false,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": true,
        "issock": false,
        "isuid": false,
        "mimetype": "inode/x-empty",
        "mode": "0600",
        "mtime": 1657005500.596,
        "nlink": 1,
        "path": "/etc/crypttab",
        "pw_name": "root",
        "readable": true,
        "rgrp": false,
        "roth": false,
        "rusr": true,
        "size": 0,
        "uid": 0,
        "version": "10",
        "wgrp": false,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": false,
        "xoth": false,
        "xusr": false
    }
}

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 06 July 2022  11:02:00 +0000 (0:00:00.428)       0:00:52.708 ******** 

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 06 July 2022  11:02:00 +0000 (0:00:00.024)       0:00:52.733 ******** 
ok: [/cache/fedora-35.qcow2.snap]
META: role_complete for /cache/fedora-35.qcow2.snap

TASK [include_tasks] ***********************************************************
task path: /tmp/tmprua6lrek/tests/tests_remove_mount.yml:67
Wednesday 06 July 2022  11:02:01 +0000 (0:00:00.962)       0:00:53.696 ******** 
included: /tmp/tmprua6lrek/tests/verify-role-results.yml for /cache/fedora-35.qcow2.snap

TASK [Print out pool information] **********************************************
task path: /tmp/tmprua6lrek/tests/verify-role-results.yml:1
Wednesday 06 July 2022  11:02:01 +0000 (0:00:00.048)       0:00:53.744 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "_storage_pools_list": [
        {
            "disks": [
                "sda"
            ],
            "encryption": false,
            "encryption_cipher": null,
            "encryption_key": null,
            "encryption_key_size": null,
            "encryption_luks_version": null,
            "encryption_password": null,
            "name": "foo",
            "raid_chunk_size": null,
            "raid_device_count": null,
            "raid_level": null,
            "raid_metadata_version": null,
            "raid_spare_count": null,
            "state": "present",
            "type": "lvm",
            "volumes": [
                {
                    "_device": "/dev/mapper/foo-test1",
                    "_kernel_device": "/dev/dm-0",
                    "_mount_id": "/dev/mapper/foo-test1",
                    "_raw_device": "/dev/mapper/foo-test1",
                    "_raw_kernel_device": "/dev/dm-0",
                    "cache_devices": [],
                    "cache_mode": null,
                    "cache_size": 0,
                    "cached": false,
                    "compression": null,
                    "deduplication": null,
                    "disks": [
                        "sda"
                    ],
                    "encryption": false,
                    "encryption_cipher": null,
                    "encryption_key": null,
                    "encryption_key_size": null,
                    "encryption_luks_version": null,
                    "encryption_password": null,
                    "fs_create_options": "",
                    "fs_label": "",
                    "fs_overwrite_existing": true,
                    "fs_type": "xfs",
                    "mount_check": 0,
                    "mount_device_identifier": "uuid",
                    "mount_options": "defaults",
                    "mount_passno": 0,
                    "mount_point": "",
                    "name": "test1",
                    "raid_chunk_size": null,
                    "raid_device_count": null,
                    "raid_disks": [],
                    "raid_level": null,
                    "raid_metadata_version": null,
                    "raid_spare_count": null,
                    "size": "3g",
                    "state": "present",
                    "thin": false,
                    "thin_pool_name": null,
                    "thin_pool_size": null,
                    "type": "lvm",
                    "vdo_pool_size": null
                }
            ]
        }
    ]
}

TASK [Print out volume information] ********************************************
task path: /tmp/tmprua6lrek/tests/verify-role-results.yml:6
Wednesday 06 July 2022  11:02:01 +0000 (0:00:00.086)       0:00:53.831 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {}

TASK [Collect info about the volumes.] *****************************************
task path: /tmp/tmprua6lrek/tests/verify-role-results.yml:14
Wednesday 06 July 2022  11:02:01 +0000 (0:00:00.071)       0:00:53.903 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "info": {
        "/dev/mapper/foo-test1": {
            "fstype": "xfs",
            "label": "",
            "name": "/dev/mapper/foo-test1",
            "size": "3G",
            "type": "lvm",
            "uuid": "e363aca8-4773-4789-a338-0cadc3e349f5"
        },
        "/dev/nvme0n1": {
            "fstype": "",
            "label": "",
            "name": "/dev/nvme0n1",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/nvme1n1": {
            "fstype": "",
            "label": "",
            "name": "/dev/nvme1n1",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/nvme2n1": {
            "fstype": "",
            "label": "",
            "name": "/dev/nvme2n1",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/sda": {
            "fstype": "LVM2_member",
            "label": "",
            "name": "/dev/sda",
            "size": "10G",
            "type": "disk",
            "uuid": "jTr21u-XFvQ-FDKv-U00Y-g4iZ-7hhe-ySdWgB"
        },
        "/dev/sdb": {
            "fstype": "",
            "label": "",
            "name": "/dev/sdb",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/sdc": {
            "fstype": "",
            "label": "",
            "name": "/dev/sdc",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/sr0": {
            "fstype": "iso9660",
            "label": "cidata",
            "name": "/dev/sr0",
            "size": "364K",
            "type": "rom",
            "uuid": "2022-07-06-11-00-54-00"
        },
        "/dev/vda": {
            "fstype": "",
            "label": "",
            "name": "/dev/vda",
            "size": "4G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vda1": {
            "fstype": "",
            "label": "",
            "name": "/dev/vda1",
            "size": "1M",
            "type": "partition",
            "uuid": ""
        },
        "/dev/vda2": {
            "fstype": "ext4",
            "label": "boot",
            "name": "/dev/vda2",
            "size": "500M",
            "type": "partition",
            "uuid": "5f2f82d0-ae0a-4574-8811-62a31a51a870"
        },
        "/dev/vda3": {
            "fstype": "vfat",
            "label": "",
            "name": "/dev/vda3",
            "size": "100M",
            "type": "partition",
            "uuid": "5B84-6DD7"
        },
        "/dev/vda4": {
            "fstype": "",
            "label": "",
            "name": "/dev/vda4",
            "size": "4M",
            "type": "partition",
            "uuid": ""
        },
        "/dev/vda5": {
            "fstype": "btrfs",
            "label": "fedora",
            "name": "/dev/vda5",
            "size": "3.4G",
            "type": "partition",
            "uuid": "fbdaf05f-1a41-4dc5-b56e-a10edb430f9a"
        },
        "/dev/vdb": {
            "fstype": "",
            "label": "",
            "name": "/dev/vdb",
            "size": "2G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vdb1": {
            "fstype": "ext4",
            "label": "yumcache",
            "name": "/dev/vdb1",
            "size": "2G",
            "type": "partition",
            "uuid": "e676dfc5-3e4b-4331-8ede-73c3f56d2cab"
        },
        "/dev/vdc": {
            "fstype": "",
            "label": "",
            "name": "/dev/vdc",
            "size": "2G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vdc1": {
            "fstype": "ext4",
            "label": "yumvarlib",
            "name": "/dev/vdc1",
            "size": "2G",
            "type": "partition",
            "uuid": "0c299eb4-81f5-4414-b246-b95738eb82f0"
        },
        "/dev/vdd": {
            "fstype": "",
            "label": "",
            "name": "/dev/vdd",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vde": {
            "fstype": "",
            "label": "",
            "name": "/dev/vde",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vdf": {
            "fstype": "",
            "label": "",
            "name": "/dev/vdf",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/zram0": {
            "fstype": "",
            "label": "",
            "name": "/dev/zram0",
            "size": "1.9G",
            "type": "disk",
            "uuid": ""
        }
    }
}

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmprua6lrek/tests/verify-role-results.yml:19
Wednesday 06 July 2022  11:02:01 +0000 (0:00:00.408)       0:00:54.312 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "cmd": [
        "cat",
        "/etc/fstab"
    ],
    "delta": "0:00:00.003380",
    "end": "2022-07-06 11:02:01.440502",
    "rc": 0,
    "start": "2022-07-06 11:02:01.437122"
}

STDOUT:


#
# /etc/fstab
# Created by anaconda on Tue Jul  5 07:18:20 2022
#
# Accessible filesystems, by reference, are maintained under '/dev/disk/'.
# See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info.
#
# After editing this file, run 'systemctl daemon-reload' to update systemd
# units generated from this file.
#
UUID=fbdaf05f-1a41-4dc5-b56e-a10edb430f9a /                       btrfs   subvol=root,compress=zstd:1 0 0
UUID=5f2f82d0-ae0a-4574-8811-62a31a51a870 /boot                   ext4    defaults        1 2
UUID=5B84-6DD7          /boot/efi               vfat    defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
UUID=fbdaf05f-1a41-4dc5-b56e-a10edb430f9a /home                   btrfs   subvol=home,compress=zstd:1 0 0
/dev/vdb1	/var/cache/dnf	auto	defaults,nofail,comment=cloudconfig	0	2
/dev/vdc1	/var/lib/dnf	auto	defaults,nofail,comment=cloudconfig	0	2
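
The fstab listing above contains no entry for /dev/mapper/foo-test1, confirming that the mount was removed. A minimal sketch of such a check, assuming the cat result is registered as storage_test_fstab (the name of the fact cleaned up later in the log):

    - name: Verify that the removed mount is absent from /etc/fstab (sketch)
      assert:
        that:
          - storage_test_fstab.stdout is not search('foo-test1')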

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmprua6lrek/tests/verify-role-results.yml:24
Wednesday 06 July 2022  11:02:02 +0000 (0:00:00.423)       0:00:54.735 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "cmd": [
        "cat",
        "/etc/crypttab"
    ],
    "delta": "0:00:00.003243",
    "end": "2022-07-06 11:02:01.859988",
    "failed_when_result": false,
    "rc": 0,
    "start": "2022-07-06 11:02:01.856745"
}

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmprua6lrek/tests/verify-role-results.yml:33
Wednesday 06 July 2022  11:02:02 +0000 (0:00:00.407)       0:00:55.143 ******** 
[WARNING]: The loop variable 'storage_test_pool' is already in use. You should
set the `loop_var` value in the `loop_control` option for the task to something
else to avoid variable collisions and unexpected behavior.
included: /tmp/tmprua6lrek/tests/test-verify-pool.yml for /cache/fedora-35.qcow2.snap => (item={'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '', 'name': 'test1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/dm-0'}], 'raid_chunk_size': None})
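
The warning above can be silenced by giving the include loop its own loop variable via loop_control, as the message suggests. A minimal sketch of that fix (renaming the loop variable would also require updating the references to storage_test_pool inside test-verify-pool.yml):

    - name: Verify the volumes listed in storage_pools were correctly managed (sketch)
      include_tasks: test-verify-pool.yml
      loop: "{{ _storage_pools_list }}"
      loop_control:
        loop_var: storage_test_pool_iter   # any name that does not collide with existing variables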

TASK [Set _storage_pool_tests] *************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool.yml:5
Wednesday 06 July 2022  11:02:02 +0000 (0:00:00.062)       0:00:55.205 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_pool_tests": [
            "members",
            "volumes"
        ]
    },
    "changed": false
}

TASK [include_tasks] ***********************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool.yml:18
Wednesday 06 July 2022  11:02:02 +0000 (0:00:00.035)       0:00:55.241 ******** 
included: /tmp/tmprua6lrek/tests/test-verify-pool-members.yml for /cache/fedora-35.qcow2.snap => (item=members)
included: /tmp/tmprua6lrek/tests/test-verify-pool-volumes.yml for /cache/fedora-35.qcow2.snap => (item=volumes)

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool-members.yml:1
Wednesday 06 July 2022  11:02:02 +0000 (0:00:00.048)       0:00:55.289 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_expected_pv_count": "1",
        "_storage_test_pool_pvs_lvm": [
            "/dev/sda"
        ]
    },
    "changed": false
}

TASK [Get the canonical device path for each member device] ********************
task path: /tmp/tmprua6lrek/tests/test-verify-pool-members.yml:10
Wednesday 06 July 2022  11:02:02 +0000 (0:00:00.059)       0:00:55.349 ******** 
ok: [/cache/fedora-35.qcow2.snap] => (item=/dev/sda) => {
    "ansible_loop_var": "pv",
    "changed": false,
    "device": "/dev/sda",
    "pv": "/dev/sda"
}
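
The task above resolves each PV to its canonical device path (here /dev/sda maps to itself). A minimal sketch of one way to do this with realpath; the role's actual task may use a different module, and the register name pv_paths is an assumption:

    - name: Get the canonical device path for each member device (sketch)
      command: realpath {{ pv }}
      loop: "{{ _storage_test_pool_pvs_lvm }}"
      loop_control:
        loop_var: pv
      register: pv_paths
      changed_when: false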

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool-members.yml:19
Wednesday 06 July 2022  11:02:03 +0000 (0:00:00.399)       0:00:55.748 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "__pvs_lvm_len": "1"
    },
    "changed": false
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool-members.yml:23
Wednesday 06 July 2022  11:02:03 +0000 (0:00:00.048)       0:00:55.797 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_pool_pvs": [
            "/dev/sda"
        ]
    },
    "changed": false
}

TASK [Verify PV count] *********************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool-members.yml:27
Wednesday 06 July 2022  11:02:03 +0000 (0:00:00.051)       0:00:55.848 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed
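
The PV count check above compares the number of PVs reported for the VG with the expected count set at the start of this block (both 1 here). A minimal sketch of that assertion using the facts visible in the log:

    - name: Verify PV count (sketch)
      assert:
        that:
          - __pvs_lvm_len | int == _storage_test_expected_pv_count | int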

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool-members.yml:34
Wednesday 06 July 2022  11:02:03 +0000 (0:00:00.049)       0:00:55.898 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_expected_pv_type": "disk"
    },
    "changed": false
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool-members.yml:38
Wednesday 06 July 2022  11:02:03 +0000 (0:00:00.040)       0:00:55.939 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_expected_pv_type": "disk"
    },
    "changed": false
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool-members.yml:42
Wednesday 06 July 2022  11:02:03 +0000 (0:00:00.049)       0:00:55.989 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check the type of each PV] ***********************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool-members.yml:46
Wednesday 06 July 2022  11:02:03 +0000 (0:00:00.027)       0:00:56.017 ******** 
ok: [/cache/fedora-35.qcow2.snap] => (item=/dev/sda) => {
    "ansible_loop_var": "pv",
    "changed": false,
    "pv": "/dev/sda"
}

MSG:

All assertions passed

TASK [Check MD RAID] ***********************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool-members.yml:56
Wednesday 06 July 2022  11:02:03 +0000 (0:00:00.055)       0:00:56.072 ******** 
included: /tmp/tmprua6lrek/tests/verify-pool-md.yml for /cache/fedora-35.qcow2.snap

TASK [get information about RAID] **********************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-md.yml:6
Wednesday 06 July 2022  11:02:03 +0000 (0:00:00.044)       0:00:56.116 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-md.yml:12
Wednesday 06 July 2022  11:02:03 +0000 (0:00:00.026)       0:00:56.142 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-md.yml:16
Wednesday 06 July 2022  11:02:03 +0000 (0:00:00.058)       0:00:56.201 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-md.yml:20
Wednesday 06 July 2022  11:02:03 +0000 (0:00:00.024)       0:00:56.226 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-md.yml:24
Wednesday 06 July 2022  11:02:03 +0000 (0:00:00.031)       0:00:56.257 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-md.yml:30
Wednesday 06 July 2022  11:02:03 +0000 (0:00:00.030)       0:00:56.288 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-md.yml:36
Wednesday 06 July 2022  11:02:03 +0000 (0:00:00.026)       0:00:56.314 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-md.yml:44
Wednesday 06 July 2022  11:02:03 +0000 (0:00:00.023)       0:00:56.337 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_md_active_devices_re": null,
        "storage_test_md_metadata_version_re": null,
        "storage_test_md_spare_devices_re": null
    },
    "changed": false
}

TASK [Check LVM RAID] **********************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool-members.yml:59
Wednesday 06 July 2022  11:02:03 +0000 (0:00:00.036)       0:00:56.373 ******** 
included: /tmp/tmprua6lrek/tests/verify-pool-members-lvmraid.yml for /cache/fedora-35.qcow2.snap

TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/tmprua6lrek/tests/verify-pool-members-lvmraid.yml:1
Wednesday 06 July 2022  11:02:03 +0000 (0:00:00.048)       0:00:56.421 ******** 
included: /tmp/tmprua6lrek/tests/verify-pool-member-lvmraid.yml for /cache/fedora-35.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '', 'name': 'test1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/dm-0'})

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-lvmraid.yml:3
Wednesday 06 July 2022  11:02:03 +0000 (0:00:00.045)       0:00:56.467 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-lvmraid.yml:8
Wednesday 06 July 2022  11:02:03 +0000 (0:00:00.028)       0:00:56.496 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-lvmraid.yml:12
Wednesday 06 July 2022  11:02:03 +0000 (0:00:00.027)       0:00:56.524 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check Thin Pools] ********************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool-members.yml:62
Wednesday 06 July 2022  11:02:03 +0000 (0:00:00.026)       0:00:56.550 ******** 
included: /tmp/tmprua6lrek/tests/verify-pool-members-thin.yml for /cache/fedora-35.qcow2.snap

TASK [Validate pool member thinpool settings] **********************************
task path: /tmp/tmprua6lrek/tests/verify-pool-members-thin.yml:1
Wednesday 06 July 2022  11:02:03 +0000 (0:00:00.043)       0:00:56.594 ******** 
included: /tmp/tmprua6lrek/tests/verify-pool-member-thin.yml for /cache/fedora-35.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '', 'name': 'test1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/dm-0'})

TASK [Get information about thinpool] ******************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-thin.yml:3
Wednesday 06 July 2022  11:02:04 +0000 (0:00:00.043)       0:00:56.637 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check that volume is in correct thinpool (when thinp name is provided)] ***
task path: /tmp/tmprua6lrek/tests/verify-pool-member-thin.yml:8
Wednesday 06 July 2022  11:02:04 +0000 (0:00:00.023)       0:00:56.661 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check that volume is in thinpool (when thinp name is not provided)] ******
task path: /tmp/tmprua6lrek/tests/verify-pool-member-thin.yml:13
Wednesday 06 July 2022  11:02:04 +0000 (0:00:00.025)       0:00:56.687 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-thin.yml:17
Wednesday 06 July 2022  11:02:04 +0000 (0:00:00.026)       0:00:56.714 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check member encryption] *************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool-members.yml:65
Wednesday 06 July 2022  11:02:04 +0000 (0:00:00.024)       0:00:56.738 ******** 
included: /tmp/tmprua6lrek/tests/verify-pool-members-encryption.yml for /cache/fedora-35.qcow2.snap

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-members-encryption.yml:4
Wednesday 06 July 2022  11:02:04 +0000 (0:00:00.048)       0:00:56.786 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_expected_crypttab_entries": "0",
        "_storage_test_expected_crypttab_key_file": "-"
    },
    "changed": false
}

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-members-encryption.yml:8
Wednesday 06 July 2022  11:02:04 +0000 (0:00:00.049)       0:00:56.836 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => (item=/dev/sda)  => {
    "_storage_test_pool_member_path": "/dev/sda",
    "ansible_loop_var": "_storage_test_pool_member_path",
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/tmprua6lrek/tests/verify-pool-members-encryption.yml:15
Wednesday 06 July 2022  11:02:04 +0000 (0:00:00.028)       0:00:56.865 ******** 
included: /tmp/tmprua6lrek/tests/verify-pool-member-crypttab.yml for /cache/fedora-35.qcow2.snap => (item=/dev/sda)

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-crypttab.yml:1
Wednesday 06 July 2022  11:02:04 +0000 (0:00:00.043)       0:00:56.908 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": []
    },
    "changed": false
}

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-crypttab.yml:6
Wednesday 06 July 2022  11:02:04 +0000 (0:00:00.048)       0:00:56.957 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-crypttab.yml:11
Wednesday 06 July 2022  11:02:04 +0000 (0:00:00.048)       0:00:57.006 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-crypttab.yml:17
Wednesday 06 July 2022  11:02:04 +0000 (0:00:00.037)       0:00:57.043 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-crypttab.yml:23
Wednesday 06 July 2022  11:02:04 +0000 (0:00:00.044)       0:00:57.087 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-crypttab.yml:29
Wednesday 06 July 2022  11:02:04 +0000 (0:00:00.039)       0:00:57.126 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": null
    },
    "changed": false
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-members-encryption.yml:22
Wednesday 06 July 2022  11:02:04 +0000 (0:00:00.037)       0:00:57.164 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": null,
        "_storage_test_crypttab_key_file": null
    },
    "changed": false
}

TASK [Check VDO] ***************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool-members.yml:68
Wednesday 06 July 2022  11:02:04 +0000 (0:00:00.040)       0:00:57.204 ******** 
included: /tmp/tmprua6lrek/tests/verify-pool-members-vdo.yml for /cache/fedora-35.qcow2.snap

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-members-vdo.yml:1
Wednesday 06 July 2022  11:02:04 +0000 (0:00:00.048)       0:00:57.253 ******** 
included: /tmp/tmprua6lrek/tests/verify-pool-member-vdo.yml for /cache/fedora-35.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '', 'name': 'test1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/dm-0'})

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-vdo.yml:3
Wednesday 06 July 2022  11:02:04 +0000 (0:00:00.048)       0:00:57.301 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-vdo.yml:8
Wednesday 06 July 2022  11:02:04 +0000 (0:00:00.024)       0:00:57.326 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [assert] ******************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-vdo.yml:11
Wednesday 06 July 2022  11:02:04 +0000 (0:00:00.026)       0:00:57.352 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [assert] ******************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-vdo.yml:16
Wednesday 06 July 2022  11:02:04 +0000 (0:00:00.026)       0:00:57.379 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-vdo.yml:21
Wednesday 06 July 2022  11:02:04 +0000 (0:00:00.084)       0:00:57.463 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [assert] ******************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-vdo.yml:24
Wednesday 06 July 2022  11:02:04 +0000 (0:00:00.025)       0:00:57.489 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [assert] ******************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-vdo.yml:29
Wednesday 06 July 2022  11:02:04 +0000 (0:00:00.024)       0:00:57.513 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-vdo.yml:39
Wednesday 06 July 2022  11:02:04 +0000 (0:00:00.025)       0:00:57.538 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_vdo_status": null
    },
    "changed": false
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool-members.yml:71
Wednesday 06 July 2022  11:02:04 +0000 (0:00:00.037)       0:00:57.576 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "__pvs_lvm_len": null,
        "_storage_test_expected_pv_count": null,
        "_storage_test_expected_pv_type": null,
        "_storage_test_pool_pvs": [],
        "_storage_test_pool_pvs_lvm": []
    },
    "changed": false
}

TASK [verify the volumes] ******************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool-volumes.yml:3
Wednesday 06 July 2022  11:02:04 +0000 (0:00:00.034)       0:00:57.611 ******** 
[WARNING]: The loop variable 'storage_test_volume' is already in use. You
should set the `loop_var` value in the `loop_control` option for the task to
something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmprua6lrek/tests/test-verify-volume.yml for /cache/fedora-35.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '', 'name': 'test1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/dm-0'})

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume.yml:2
Wednesday 06 July 2022  11:02:05 +0000 (0:00:00.045)       0:00:57.656 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_volume_present": true,
        "_storage_volume_tests": [
            "mount",
            "fstab",
            "fs",
            "device",
            "encryption",
            "md",
            "size",
            "cache"
        ]
    },
    "changed": false
}

TASK [include_tasks] ***********************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume.yml:10
Wednesday 06 July 2022  11:02:05 +0000 (0:00:00.053)       0:00:57.710 ******** 
included: /tmp/tmprua6lrek/tests/test-verify-volume-mount.yml for /cache/fedora-35.qcow2.snap => (item=mount)
included: /tmp/tmprua6lrek/tests/test-verify-volume-fstab.yml for /cache/fedora-35.qcow2.snap => (item=fstab)
included: /tmp/tmprua6lrek/tests/test-verify-volume-fs.yml for /cache/fedora-35.qcow2.snap => (item=fs)
included: /tmp/tmprua6lrek/tests/test-verify-volume-device.yml for /cache/fedora-35.qcow2.snap => (item=device)
included: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml for /cache/fedora-35.qcow2.snap => (item=encryption)
included: /tmp/tmprua6lrek/tests/test-verify-volume-md.yml for /cache/fedora-35.qcow2.snap => (item=md)
included: /tmp/tmprua6lrek/tests/test-verify-volume-size.yml for /cache/fedora-35.qcow2.snap => (item=size)
included: /tmp/tmprua6lrek/tests/test-verify-volume-cache.yml for /cache/fedora-35.qcow2.snap => (item=cache)

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-mount.yml:6
Wednesday 06 July 2022  11:02:05 +0000 (0:00:00.079)       0:00:57.790 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_device_path": "/dev/mapper/foo-test1"
    },
    "changed": false
}

TASK [Set some facts] **********************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-mount.yml:14
Wednesday 06 July 2022  11:02:05 +0000 (0:00:00.040)       0:00:57.830 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_mount_device_matches": [],
        "storage_test_mount_expected_match_count": "0",
        "storage_test_mount_point_matches": [],
        "storage_test_swap_expected_matches": "0"
    },
    "changed": false
}

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-mount.yml:28
Wednesday 06 July 2022  11:02:05 +0000 (0:00:00.066)       0:00:57.896 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-mount.yml:37
Wednesday 06 July 2022  11:02:05 +0000 (0:00:00.034)       0:00:57.930 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-mount.yml:45
Wednesday 06 July 2022  11:02:05 +0000 (0:00:00.053)       0:00:57.984 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [command] *****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-mount.yml:54
Wednesday 06 July 2022  11:02:05 +0000 (0:00:00.039)       0:00:58.024 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Gather swap info] ********************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-mount.yml:58
Wednesday 06 July 2022  11:02:05 +0000 (0:00:00.026)       0:00:58.050 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify swap status] ******************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-mount.yml:63
Wednesday 06 July 2022  11:02:05 +0000 (0:00:00.027)       0:00:58.078 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Unset facts] *************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-mount.yml:75
Wednesday 06 July 2022  11:02:05 +0000 (0:00:00.025)       0:00:58.103 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_mount_device_matches": null,
        "storage_test_mount_expected_match_count": null,
        "storage_test_mount_point_matches": null,
        "storage_test_swap_expected_matches": null,
        "storage_test_swaps": null,
        "storage_test_sys_node": null
    },
    "changed": false
}

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-fstab.yml:2
Wednesday 06 July 2022  11:02:05 +0000 (0:00:00.037)       0:00:58.141 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_fstab_expected_id_matches": "0",
        "storage_test_fstab_expected_mount_options_matches": "1",
        "storage_test_fstab_expected_mount_point_matches": "0",
        "storage_test_fstab_id_matches": [],
        "storage_test_fstab_mount_options_matches": [],
        "storage_test_fstab_mount_point_matches": []
    },
    "changed": false
}

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-fstab.yml:25
Wednesday 06 July 2022  11:02:05 +0000 (0:00:00.067)       0:00:58.208 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-fstab.yml:32
Wednesday 06 July 2022  11:02:05 +0000 (0:00:00.055)       0:00:58.264 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-fstab.yml:39
Wednesday 06 July 2022  11:02:05 +0000 (0:00:00.051)       0:00:58.316 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Clean up variables] ******************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-fstab.yml:49
Wednesday 06 July 2022  11:02:05 +0000 (0:00:00.038)       0:00:58.354 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_fstab_expected_id_matches": null,
        "storage_test_fstab_expected_mount_options_matches": null,
        "storage_test_fstab_expected_mount_point_matches": null,
        "storage_test_fstab_id_matches": null,
        "storage_test_fstab_mount_options_matches": null,
        "storage_test_fstab_mount_point_matches": null
    },
    "changed": false
}

TASK [Verify fs type] **********************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-fs.yml:4
Wednesday 06 July 2022  11:02:05 +0000 (0:00:00.036)       0:00:58.390 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-fs.yml:10
Wednesday 06 July 2022  11:02:05 +0000 (0:00:00.040)       0:00:58.430 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-device.yml:4
Wednesday 06 July 2022  11:02:05 +0000 (0:00:00.042)       0:00:58.473 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "stat": {
        "atime": 1657105285.4657035,
        "attr_flags": "",
        "attributes": [],
        "block_size": 4096,
        "blocks": 0,
        "charset": "binary",
        "ctime": 1657105283.3717036,
        "dev": 5,
        "device_type": 64768,
        "executable": false,
        "exists": true,
        "gid": 6,
        "gr_name": "disk",
        "inode": 505,
        "isblk": true,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": false,
        "issock": false,
        "isuid": false,
        "mimetype": "inode/symlink",
        "mode": "0660",
        "mtime": 1657105283.3717036,
        "nlink": 1,
        "path": "/dev/mapper/foo-test1",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": false,
        "rusr": true,
        "size": 0,
        "uid": 0,
        "version": null,
        "wgrp": true,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": false,
        "xoth": false,
        "xusr": false
    }
}

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-device.yml:10
Wednesday 06 July 2022  11:02:06 +0000 (0:00:00.414)       0:00:58.888 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-device.yml:18
Wednesday 06 July 2022  11:02:06 +0000 (0:00:00.045)       0:00:58.933 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-device.yml:24
Wednesday 06 July 2022  11:02:06 +0000 (0:00:00.043)       0:00:58.976 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "st_volume_type": "lvm"
    },
    "changed": false
}

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-device.yml:28
Wednesday 06 July 2022  11:02:06 +0000 (0:00:00.036)       0:00:59.013 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-device.yml:33
Wednesday 06 July 2022  11:02:06 +0000 (0:00:00.025)       0:00:59.039 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:3
Wednesday 06 July 2022  11:02:06 +0000 (0:00:00.043)       0:00:59.082 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Ensure cryptsetup is present] ********************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:10
Wednesday 06 July 2022  11:02:06 +0000 (0:00:00.023)       0:00:59.106 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:15
Wednesday 06 July 2022  11:02:08 +0000 (0:00:01.986)       0:01:01.092 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:21
Wednesday 06 July 2022  11:02:08 +0000 (0:00:00.025)       0:01:01.118 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:30
Wednesday 06 July 2022  11:02:08 +0000 (0:00:00.025)       0:01:01.144 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:38
Wednesday 06 July 2022  11:02:08 +0000 (0:00:00.052)       0:01:01.196 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:44
Wednesday 06 July 2022  11:02:08 +0000 (0:00:00.026)       0:01:01.222 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:49
Wednesday 06 July 2022  11:02:08 +0000 (0:00:00.023)       0:01:01.246 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:55
Wednesday 06 July 2022  11:02:08 +0000 (0:00:00.023)       0:01:01.269 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:61
Wednesday 06 July 2022  11:02:08 +0000 (0:00:00.023)       0:01:01.292 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:67
Wednesday 06 July 2022  11:02:08 +0000 (0:00:00.024)       0:01:01.316 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": [],
        "_storage_test_expected_crypttab_entries": "0",
        "_storage_test_expected_crypttab_key_file": "-"
    },
    "changed": false
}

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:74
Wednesday 06 July 2022  11:02:08 +0000 (0:00:00.049)       0:01:01.365 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:79
Wednesday 06 July 2022  11:02:08 +0000 (0:00:00.048)       0:01:01.413 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:85
Wednesday 06 July 2022  11:02:08 +0000 (0:00:00.039)       0:01:01.453 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:91
Wednesday 06 July 2022  11:02:08 +0000 (0:00:00.037)       0:01:01.490 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:97
Wednesday 06 July 2022  11:02:08 +0000 (0:00:00.038)       0:01:01.529 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": null,
        "_storage_test_expected_crypttab_entries": null,
        "_storage_test_expected_crypttab_key_file": null
    },
    "changed": false
}

TASK [get information about RAID] **********************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-md.yml:7
Wednesday 06 July 2022  11:02:08 +0000 (0:00:00.037)       0:01:01.567 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-md.yml:13
Wednesday 06 July 2022  11:02:08 +0000 (0:00:00.039)       0:01:01.607 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-md.yml:17
Wednesday 06 July 2022  11:02:09 +0000 (0:00:00.038)       0:01:01.645 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-md.yml:21
Wednesday 06 July 2022  11:02:09 +0000 (0:00:00.039)       0:01:01.685 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-md.yml:25
Wednesday 06 July 2022  11:02:09 +0000 (0:00:00.038)       0:01:01.724 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-md.yml:31
Wednesday 06 July 2022  11:02:09 +0000 (0:00:00.041)       0:01:01.766 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-md.yml:37
Wednesday 06 July 2022  11:02:09 +0000 (0:00:00.039)       0:01:01.805 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-size.yml:3
Wednesday 06 July 2022  11:02:09 +0000 (0:00:00.038)       0:01:01.843 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "bytes": 3221225472,
    "changed": false,
    "lvm": "3g",
    "parted": "3GiB",
    "size": "3 GiB"
}
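
For reference, the "3g" LVM size resolves to 3 GiB = 3 × 1024³ bytes = 3,221,225,472 bytes, which matches the "bytes" value reported above and the expected size established a few tasks below.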

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-size.yml:9
Wednesday 06 July 2022  11:02:09 +0000 (0:00:00.437)       0:01:02.280 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "bytes": 3221225472,
    "changed": false,
    "lvm": "3g",
    "parted": "3GiB",
    "size": "3 GiB"
}

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-size.yml:15
Wednesday 06 July 2022  11:02:10 +0000 (0:00:00.440)       0:01:02.721 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_expected_size": "3221225472"
    },
    "changed": false
}

TASK [debug] *******************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-size.yml:20
Wednesday 06 July 2022  11:02:10 +0000 (0:00:00.112)       0:01:02.833 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "storage_test_expected_size": "3221225472"
}

TASK [debug] *******************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-size.yml:25
Wednesday 06 July 2022  11:02:10 +0000 (0:00:00.036)       0:01:02.870 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-size.yml:28
Wednesday 06 July 2022  11:02:10 +0000 (0:00:00.036)       0:01:02.906 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-size.yml:31
Wednesday 06 July 2022  11:02:10 +0000 (0:00:00.035)       0:01:02.941 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [debug] *******************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-size.yml:36
Wednesday 06 July 2022  11:02:10 +0000 (0:00:00.036)       0:01:02.978 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmprua6lrek/tests/test-verify-volume-size.yml:39
Wednesday 06 July 2022  11:02:10 +0000 (0:00:00.035)       0:01:03.014 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [debug] *******************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-size.yml:44
Wednesday 06 July 2022  11:02:10 +0000 (0:00:00.035)       0:01:03.049 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "storage_test_actual_size": {
        "bytes": 3221225472,
        "changed": false,
        "failed": false,
        "lvm": "3g",
        "parted": "3GiB",
        "size": "3 GiB"
    }
}

TASK [debug] *******************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-size.yml:47
Wednesday 06 July 2022  11:02:10 +0000 (0:00:00.035)       0:01:03.084 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "storage_test_expected_size": "3221225472"
}

TASK [assert] ******************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-size.yml:50
Wednesday 06 July 2022  11:02:10 +0000 (0:00:00.041)       0:01:03.126 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed
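
The assertion here presumably compares the parsed actual size with the expected size shown in the preceding debug tasks; a minimal sketch of such a check, in assumed form (the exact expression in test-verify-volume-size.yml is not reproduced in this log):

    - assert:
        that:
          - (storage_test_actual_size.bytes | int) == (storage_test_expected_size | int)
      # Both values are 3221225472 here, so the assertion passes.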

TASK [Get information about the LV] ********************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-cache.yml:6
Wednesday 06 July 2022  11:02:10 +0000 (0:00:00.052)       0:01:03.178 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "cmd": [
        "lvs",
        "--noheadings",
        "--nameprefixes",
        "--units=b",
        "--nosuffix",
        "--unquoted",
        "-o",
        "name,attr,cache_total_blocks,chunk_size,segtype",
        "foo/test1"
    ],
    "delta": "0:00:00.043094",
    "end": "2022-07-06 11:02:10.349306",
    "rc": 0,
    "start": "2022-07-06 11:02:10.306212"
}

STDOUT:

  LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-a----- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear
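
The set_fact that follows evidently pulls the segment type out of the LVM2_* name-prefixed output above. A minimal sketch of such an extraction, in assumed form (the register name "lvs_info" and the regex are illustrative, not copied from test-verify-volume-cache.yml):

    - set_fact:
        storage_test_lv_segtype: "{{ lvs_info.stdout | regex_findall('LVM2_SEGTYPE=(\\S+)') }}"
      # With the STDOUT shown above this evaluates to ["linear"], matching the
      # storage_test_lv_segtype value reported by the next task.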

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-cache.yml:14
Wednesday 06 July 2022  11:02:11 +0000 (0:00:00.455)       0:01:03.634 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_lv_segtype": [
            "linear"
        ]
    },
    "changed": false
}

TASK [check segment type] ******************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-cache.yml:17
Wednesday 06 July 2022  11:02:11 +0000 (0:00:00.052)       0:01:03.687 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-cache.yml:22
Wednesday 06 July 2022  11:02:11 +0000 (0:00:00.052)       0:01:03.739 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-cache.yml:26
Wednesday 06 July 2022  11:02:11 +0000 (0:00:00.037)       0:01:03.777 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-cache.yml:32
Wednesday 06 July 2022  11:02:11 +0000 (0:00:00.040)       0:01:03.817 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check cache size] ********************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-cache.yml:36
Wednesday 06 July 2022  11:02:11 +0000 (0:00:00.044)       0:01:03.862 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Clean up facts] **********************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume.yml:16
Wednesday 06 July 2022  11:02:11 +0000 (0:00:00.047)       0:01:03.909 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_volume_present": null
    },
    "changed": false
}

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmprua6lrek/tests/verify-role-results.yml:40
Wednesday 06 July 2022  11:02:11 +0000 (0:00:00.043)       0:01:03.953 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_pool": null
    },
    "changed": false
}

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmprua6lrek/tests/verify-role-results.yml:47
Wednesday 06 July 2022  11:02:11 +0000 (0:00:00.035)       0:01:03.989 ******** 

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmprua6lrek/tests/verify-role-results.yml:57
Wednesday 06 July 2022  11:02:11 +0000 (0:00:00.022)       0:01:04.011 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_blkinfo": null,
        "storage_test_crypttab": null,
        "storage_test_fstab": null,
        "storage_test_volume": null
    },
    "changed": false
}

TASK [Clean up] ****************************************************************
task path: /tmp/tmprua6lrek/tests/tests_remove_mount.yml:69
Wednesday 06 July 2022  11:02:11 +0000 (0:00:00.037)       0:01:04.049 ******** 

TASK [linux-system-roles.storage : set platform/version specific variables] ****
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main.yml:2
Wednesday 06 July 2022  11:02:11 +0000 (0:00:00.063)       0:01:04.113 ******** 
included: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/set_vars.yml for /cache/fedora-35.qcow2.snap

TASK [linux-system-roles.storage : Ensure ansible_facts used by role] **********
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:2
Wednesday 06 July 2022  11:02:11 +0000 (0:00:00.037)       0:01:04.150 ******** 
ok: [/cache/fedora-35.qcow2.snap]

TASK [linux-system-roles.storage : Set platform/version specific variables] ****
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/set_vars.yml:8
Wednesday 06 July 2022  11:02:12 +0000 (0:00:00.559)       0:01:04.710 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => (item=RedHat.yml)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
ok: [/cache/fedora-35.qcow2.snap] => (item=Fedora.yml) => {
    "ansible_facts": {
        "_storage_copr_packages": [
            {
                "packages": [
                    "vdo",
                    "kmod-vdo"
                ],
                "repository": "rhawalsh/dm-vdo"
            }
        ],
        "_storage_copr_support_packages": [
            "dnf-plugins-core"
        ],
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/vars/Fedora.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "Fedora.yml"
}
skipping: [/cache/fedora-35.qcow2.snap] => (item=Fedora_35.yml)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "Fedora_35.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [/cache/fedora-35.qcow2.snap] => (item=Fedora_35.yml)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "Fedora_35.yml",
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main.yml:5
Wednesday 06 July 2022  11:02:12 +0000 (0:00:00.109)       0:01:04.819 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main.yml:9
Wednesday 06 July 2022  11:02:12 +0000 (0:00:00.133)       0:01:04.952 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : include the appropriate provider tasks] *****
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main.yml:13
Wednesday 06 July 2022  11:02:12 +0000 (0:00:00.037)       0:01:04.990 ******** 
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
included: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml for /cache/fedora-35.qcow2.snap

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
Wednesday 06 July 2022  11:02:12 +0000 (0:00:00.051)       0:01:05.041 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : make sure blivet is available] **************
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
Wednesday 06 July 2022  11:02:12 +0000 (0:00:00.023)       0:01:05.065 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : show storage_pools] *************************
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:14
Wednesday 06 July 2022  11:02:12 +0000 (0:00:00.032)       0:01:05.097 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "storage_pools": [
        {
            "disks": [
                "sda"
            ],
            "name": "foo",
            "state": "absent",
            "volumes": [
                {
                    "mount_point": "",
                    "name": "test1",
                    "size": "3g"
                }
            ]
        }
    ]
}
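
This storage_pools value is what drives the cleanup: pool "foo" and its volume are declared with state "absent". A minimal sketch of how a calling playbook might hand it to the role (illustrative only; the actual wording in tests_remove_mount.yml is not reproduced in this log):

    - hosts: all
      tasks:
        - include_role:
            name: linux-system-roles.storage
          vars:
            storage_pools:
              - name: foo
                disks: ["sda"]
                state: absent
                volumes:
                  - name: test1
                    size: "3g"
                    mount_point: ""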

TASK [linux-system-roles.storage : show storage_volumes] ***********************
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:19
Wednesday 06 July 2022  11:02:12 +0000 (0:00:00.041)       0:01:05.138 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined"
}

TASK [linux-system-roles.storage : get required packages] **********************
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24
Wednesday 06 July 2022  11:02:12 +0000 (0:00:00.042)       0:01:05.181 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : enable copr repositories if needed] *********
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:37
Wednesday 06 July 2022  11:02:12 +0000 (0:00:00.028)       0:01:05.210 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : make sure required packages are installed] ***
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
Wednesday 06 July 2022  11:02:12 +0000 (0:00:00.030)       0:01:05.240 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : get service facts] **************************
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51
Wednesday 06 July 2022  11:02:12 +0000 (0:00:00.028)       0:01:05.269 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : Set storage_cryptsetup_services] ************
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
Wednesday 06 July 2022  11:02:12 +0000 (0:00:00.028)       0:01:05.298 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_cryptsetup_services": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : Mask the systemd cryptsetup services] *******
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:71
Wednesday 06 July 2022  11:02:12 +0000 (0:00:00.060)       0:01:05.358 ******** 

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77
Wednesday 06 July 2022  11:02:12 +0000 (0:00:00.022)       0:01:05.380 ******** 
changed: [/cache/fedora-35.qcow2.snap] => {
    "actions": [
        {
            "action": "destroy format",
            "device": "/dev/mapper/foo-test1",
            "fs_type": "xfs"
        },
        {
            "action": "destroy device",
            "device": "/dev/mapper/foo-test1",
            "fs_type": null
        },
        {
            "action": "destroy device",
            "device": "/dev/foo",
            "fs_type": null
        },
        {
            "action": "destroy format",
            "device": "/dev/sda",
            "fs_type": "lvmpv"
        }
    ],
    "changed": true,
    "crypts": [],
    "leaves": [
        "/dev/sr0",
        "/dev/vda1",
        "/dev/vda2",
        "/dev/vda3",
        "/dev/vda4",
        "/dev/vda5",
        "/dev/sda",
        "/dev/sdb",
        "/dev/sdc",
        "/dev/nvme0n1",
        "/dev/nvme1n1",
        "/dev/nvme2n1",
        "/dev/vdb1",
        "/dev/vdc1",
        "/dev/vdd",
        "/dev/vde",
        "/dev/vdf",
        "/dev/zram0"
    ],
    "mounts": [],
    "packages": [
        "dosfstools",
        "e2fsprogs",
        "btrfs-progs"
    ],
    "pools": [
        {
            "disks": [
                "sda"
            ],
            "encryption": false,
            "encryption_cipher": null,
            "encryption_key": null,
            "encryption_key_size": null,
            "encryption_luks_version": null,
            "encryption_password": null,
            "name": "foo",
            "raid_chunk_size": null,
            "raid_device_count": null,
            "raid_level": null,
            "raid_metadata_version": null,
            "raid_spare_count": null,
            "state": "absent",
            "type": "lvm",
            "volumes": [
                {
                    "_device": "/dev/mapper/foo-test1",
                    "_mount_id": "/dev/mapper/foo-test1",
                    "_raw_device": "/dev/mapper/foo-test1",
                    "cache_devices": [],
                    "cache_mode": null,
                    "cache_size": 0,
                    "cached": false,
                    "compression": null,
                    "deduplication": null,
                    "disks": [
                        "sda"
                    ],
                    "encryption": false,
                    "encryption_cipher": null,
                    "encryption_key": null,
                    "encryption_key_size": null,
                    "encryption_luks_version": null,
                    "encryption_password": null,
                    "fs_create_options": "",
                    "fs_label": "",
                    "fs_overwrite_existing": true,
                    "fs_type": "xfs",
                    "mount_check": 0,
                    "mount_device_identifier": "uuid",
                    "mount_options": "defaults",
                    "mount_passno": 0,
                    "mount_point": "",
                    "name": "test1",
                    "raid_chunk_size": null,
                    "raid_device_count": null,
                    "raid_disks": [],
                    "raid_level": null,
                    "raid_metadata_version": null,
                    "raid_spare_count": null,
                    "size": "3g",
                    "state": "present",
                    "thin": false,
                    "thin_pool_name": null,
                    "thin_pool_size": null,
                    "type": "lvm",
                    "vdo_pool_size": null
                }
            ]
        }
    ],
    "volumes": []
}

TASK [linux-system-roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:91
Wednesday 06 July 2022  11:02:15 +0000 (0:00:02.451)       0:01:07.832 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : Unmask the systemd cryptsetup services] *****
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:103
Wednesday 06 July 2022  11:02:15 +0000 (0:00:00.038)       0:01:07.871 ******** 

TASK [linux-system-roles.storage : show blivet_output] *************************
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:109
Wednesday 06 July 2022  11:02:15 +0000 (0:00:00.023)       0:01:07.894 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "blivet_output": {
        "actions": [
            {
                "action": "destroy format",
                "device": "/dev/mapper/foo-test1",
                "fs_type": "xfs"
            },
            {
                "action": "destroy device",
                "device": "/dev/mapper/foo-test1",
                "fs_type": null
            },
            {
                "action": "destroy device",
                "device": "/dev/foo",
                "fs_type": null
            },
            {
                "action": "destroy format",
                "device": "/dev/sda",
                "fs_type": "lvmpv"
            }
        ],
        "changed": true,
        "crypts": [],
        "failed": false,
        "leaves": [
            "/dev/sr0",
            "/dev/vda1",
            "/dev/vda2",
            "/dev/vda3",
            "/dev/vda4",
            "/dev/vda5",
            "/dev/sda",
            "/dev/sdb",
            "/dev/sdc",
            "/dev/nvme0n1",
            "/dev/nvme1n1",
            "/dev/nvme2n1",
            "/dev/vdb1",
            "/dev/vdc1",
            "/dev/vdd",
            "/dev/vde",
            "/dev/vdf",
            "/dev/zram0"
        ],
        "mounts": [],
        "packages": [
            "dosfstools",
            "e2fsprogs",
            "btrfs-progs"
        ],
        "pools": [
            {
                "disks": [
                    "sda"
                ],
                "encryption": false,
                "encryption_cipher": null,
                "encryption_key": null,
                "encryption_key_size": null,
                "encryption_luks_version": null,
                "encryption_password": null,
                "name": "foo",
                "raid_chunk_size": null,
                "raid_device_count": null,
                "raid_level": null,
                "raid_metadata_version": null,
                "raid_spare_count": null,
                "state": "absent",
                "type": "lvm",
                "volumes": [
                    {
                        "_device": "/dev/mapper/foo-test1",
                        "_mount_id": "/dev/mapper/foo-test1",
                        "_raw_device": "/dev/mapper/foo-test1",
                        "cache_devices": [],
                        "cache_mode": null,
                        "cache_size": 0,
                        "cached": false,
                        "compression": null,
                        "deduplication": null,
                        "disks": [
                            "sda"
                        ],
                        "encryption": false,
                        "encryption_cipher": null,
                        "encryption_key": null,
                        "encryption_key_size": null,
                        "encryption_luks_version": null,
                        "encryption_password": null,
                        "fs_create_options": "",
                        "fs_label": "",
                        "fs_overwrite_existing": true,
                        "fs_type": "xfs",
                        "mount_check": 0,
                        "mount_device_identifier": "uuid",
                        "mount_options": "defaults",
                        "mount_passno": 0,
                        "mount_point": "",
                        "name": "test1",
                        "raid_chunk_size": null,
                        "raid_device_count": null,
                        "raid_disks": [],
                        "raid_level": null,
                        "raid_metadata_version": null,
                        "raid_spare_count": null,
                        "size": "3g",
                        "state": "present",
                        "thin": false,
                        "thin_pool_name": null,
                        "thin_pool_size": null,
                        "type": "lvm",
                        "vdo_pool_size": null
                    }
                ]
            }
        ],
        "volumes": []
    }
}
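
The actions above are what blivet reports once the pool is re-declared with state "absent": the XFS format, the LV, the VG, and finally the lvmpv label on /dev/sda are destroyed, and no mounts remain to manage. As a rough, illustrative sketch (not copied from tests_remove_mount.yml), a play that drives this kind of cleanup through the role could look like:

    - hosts: all
      tasks:
        - name: Remove the pool created earlier (illustrative sketch only)
          include_role:
            name: linux-system-roles.storage
          vars:
            storage_pools:
              - name: foo
                disks: ["sda"]
                state: absent
                volumes:
                  - name: test1
                    size: 3g
                    mount_point: ""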

TASK [linux-system-roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:114
Wednesday 06 July 2022  11:02:15 +0000 (0:00:00.043)       0:01:07.937 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_pools_list": [
            {
                "disks": [
                    "sda"
                ],
                "encryption": false,
                "encryption_cipher": null,
                "encryption_key": null,
                "encryption_key_size": null,
                "encryption_luks_version": null,
                "encryption_password": null,
                "name": "foo",
                "raid_chunk_size": null,
                "raid_device_count": null,
                "raid_level": null,
                "raid_metadata_version": null,
                "raid_spare_count": null,
                "state": "absent",
                "type": "lvm",
                "volumes": [
                    {
                        "_device": "/dev/mapper/foo-test1",
                        "_mount_id": "/dev/mapper/foo-test1",
                        "_raw_device": "/dev/mapper/foo-test1",
                        "cache_devices": [],
                        "cache_mode": null,
                        "cache_size": 0,
                        "cached": false,
                        "compression": null,
                        "deduplication": null,
                        "disks": [
                            "sda"
                        ],
                        "encryption": false,
                        "encryption_cipher": null,
                        "encryption_key": null,
                        "encryption_key_size": null,
                        "encryption_luks_version": null,
                        "encryption_password": null,
                        "fs_create_options": "",
                        "fs_label": "",
                        "fs_overwrite_existing": true,
                        "fs_type": "xfs",
                        "mount_check": 0,
                        "mount_device_identifier": "uuid",
                        "mount_options": "defaults",
                        "mount_passno": 0,
                        "mount_point": "",
                        "name": "test1",
                        "raid_chunk_size": null,
                        "raid_device_count": null,
                        "raid_disks": [],
                        "raid_level": null,
                        "raid_metadata_version": null,
                        "raid_spare_count": null,
                        "size": "3g",
                        "state": "present",
                        "thin": false,
                        "thin_pool_name": null,
                        "thin_pool_size": null,
                        "type": "lvm",
                        "vdo_pool_size": null
                    }
                ]
            }
        ]
    },
    "changed": false
}

TASK [linux-system-roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:118
Wednesday 06 July 2022  11:02:15 +0000 (0:00:00.077)       0:01:08.015 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : remove obsolete mounts] *********************
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:134
Wednesday 06 July 2022  11:02:15 +0000 (0:00:00.036)       0:01:08.052 ******** 

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146
Wednesday 06 July 2022  11:02:15 +0000 (0:00:00.038)       0:01:08.090 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : set up new/current mounts] ******************
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:151
Wednesday 06 July 2022  11:02:15 +0000 (0:00:00.026)       0:01:08.116 ******** 

TASK [linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163
Wednesday 06 July 2022  11:02:15 +0000 (0:00:00.037)       0:01:08.154 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:171
Wednesday 06 July 2022  11:02:15 +0000 (0:00:00.058)       0:01:08.213 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "stat": {
        "atime": 1657093385.4860332,
        "attr_flags": "",
        "attributes": [],
        "block_size": 4096,
        "blocks": 0,
        "charset": "binary",
        "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709",
        "ctime": 1657005647.423,
        "dev": 31,
        "device_type": 0,
        "executable": false,
        "exists": true,
        "gid": 0,
        "gr_name": "root",
        "inode": 267,
        "isblk": false,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": true,
        "issock": false,
        "isuid": false,
        "mimetype": "inode/x-empty",
        "mode": "0600",
        "mtime": 1657005500.596,
        "nlink": 1,
        "path": "/etc/crypttab",
        "pw_name": "root",
        "readable": true,
        "rgrp": false,
        "roth": false,
        "rusr": true,
        "size": 0,
        "uid": 0,
        "version": "10",
        "wgrp": false,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": false,
        "xoth": false,
        "xusr": false
    }
}

TASK [linux-system-roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:176
Wednesday 06 July 2022  11:02:16 +0000 (0:00:00.419)       0:01:08.632 ******** 

TASK [linux-system-roles.storage : Update facts] *******************************
task path: /tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198
Wednesday 06 July 2022  11:02:16 +0000 (0:00:00.027)       0:01:08.660 ******** 
ok: [/cache/fedora-35.qcow2.snap]
META: role_complete for /cache/fedora-35.qcow2.snap

TASK [include_tasks] ***********************************************************
task path: /tmp/tmprua6lrek/tests/tests_remove_mount.yml:82
Wednesday 06 July 2022  11:02:17 +0000 (0:00:00.957)       0:01:09.617 ******** 
included: /tmp/tmprua6lrek/tests/verify-role-results.yml for /cache/fedora-35.qcow2.snap

TASK [Print out pool information] **********************************************
task path: /tmp/tmprua6lrek/tests/verify-role-results.yml:1
Wednesday 06 July 2022  11:02:17 +0000 (0:00:00.061)       0:01:09.679 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "_storage_pools_list": [
        {
            "disks": [
                "sda"
            ],
            "encryption": false,
            "encryption_cipher": null,
            "encryption_key": null,
            "encryption_key_size": null,
            "encryption_luks_version": null,
            "encryption_password": null,
            "name": "foo",
            "raid_chunk_size": null,
            "raid_device_count": null,
            "raid_level": null,
            "raid_metadata_version": null,
            "raid_spare_count": null,
            "state": "absent",
            "type": "lvm",
            "volumes": [
                {
                    "_device": "/dev/mapper/foo-test1",
                    "_mount_id": "/dev/mapper/foo-test1",
                    "_raw_device": "/dev/mapper/foo-test1",
                    "cache_devices": [],
                    "cache_mode": null,
                    "cache_size": 0,
                    "cached": false,
                    "compression": null,
                    "deduplication": null,
                    "disks": [
                        "sda"
                    ],
                    "encryption": false,
                    "encryption_cipher": null,
                    "encryption_key": null,
                    "encryption_key_size": null,
                    "encryption_luks_version": null,
                    "encryption_password": null,
                    "fs_create_options": "",
                    "fs_label": "",
                    "fs_overwrite_existing": true,
                    "fs_type": "xfs",
                    "mount_check": 0,
                    "mount_device_identifier": "uuid",
                    "mount_options": "defaults",
                    "mount_passno": 0,
                    "mount_point": "",
                    "name": "test1",
                    "raid_chunk_size": null,
                    "raid_device_count": null,
                    "raid_disks": [],
                    "raid_level": null,
                    "raid_metadata_version": null,
                    "raid_spare_count": null,
                    "size": "3g",
                    "state": "present",
                    "thin": false,
                    "thin_pool_name": null,
                    "thin_pool_size": null,
                    "type": "lvm",
                    "vdo_pool_size": null
                }
            ]
        }
    ]
}

TASK [Print out volume information] ********************************************
task path: /tmp/tmprua6lrek/tests/verify-role-results.yml:6
Wednesday 06 July 2022  11:02:17 +0000 (0:00:00.055)       0:01:09.735 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {}

TASK [Collect info about the volumes.] *****************************************
task path: /tmp/tmprua6lrek/tests/verify-role-results.yml:14
Wednesday 06 July 2022  11:02:17 +0000 (0:00:00.040)       0:01:09.775 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "info": {
        "/dev/nvme0n1": {
            "fstype": "",
            "label": "",
            "name": "/dev/nvme0n1",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/nvme1n1": {
            "fstype": "",
            "label": "",
            "name": "/dev/nvme1n1",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/nvme2n1": {
            "fstype": "",
            "label": "",
            "name": "/dev/nvme2n1",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/sda": {
            "fstype": "",
            "label": "",
            "name": "/dev/sda",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/sdb": {
            "fstype": "",
            "label": "",
            "name": "/dev/sdb",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/sdc": {
            "fstype": "",
            "label": "",
            "name": "/dev/sdc",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/sr0": {
            "fstype": "iso9660",
            "label": "cidata",
            "name": "/dev/sr0",
            "size": "364K",
            "type": "rom",
            "uuid": "2022-07-06-11-00-54-00"
        },
        "/dev/vda": {
            "fstype": "",
            "label": "",
            "name": "/dev/vda",
            "size": "4G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vda1": {
            "fstype": "",
            "label": "",
            "name": "/dev/vda1",
            "size": "1M",
            "type": "partition",
            "uuid": ""
        },
        "/dev/vda2": {
            "fstype": "ext4",
            "label": "boot",
            "name": "/dev/vda2",
            "size": "500M",
            "type": "partition",
            "uuid": "5f2f82d0-ae0a-4574-8811-62a31a51a870"
        },
        "/dev/vda3": {
            "fstype": "vfat",
            "label": "",
            "name": "/dev/vda3",
            "size": "100M",
            "type": "partition",
            "uuid": "5B84-6DD7"
        },
        "/dev/vda4": {
            "fstype": "",
            "label": "",
            "name": "/dev/vda4",
            "size": "4M",
            "type": "partition",
            "uuid": ""
        },
        "/dev/vda5": {
            "fstype": "btrfs",
            "label": "fedora",
            "name": "/dev/vda5",
            "size": "3.4G",
            "type": "partition",
            "uuid": "fbdaf05f-1a41-4dc5-b56e-a10edb430f9a"
        },
        "/dev/vdb": {
            "fstype": "",
            "label": "",
            "name": "/dev/vdb",
            "size": "2G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vdb1": {
            "fstype": "ext4",
            "label": "yumcache",
            "name": "/dev/vdb1",
            "size": "2G",
            "type": "partition",
            "uuid": "e676dfc5-3e4b-4331-8ede-73c3f56d2cab"
        },
        "/dev/vdc": {
            "fstype": "",
            "label": "",
            "name": "/dev/vdc",
            "size": "2G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vdc1": {
            "fstype": "ext4",
            "label": "yumvarlib",
            "name": "/dev/vdc1",
            "size": "2G",
            "type": "partition",
            "uuid": "0c299eb4-81f5-4414-b246-b95738eb82f0"
        },
        "/dev/vdd": {
            "fstype": "",
            "label": "",
            "name": "/dev/vdd",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vde": {
            "fstype": "",
            "label": "",
            "name": "/dev/vde",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vdf": {
            "fstype": "",
            "label": "",
            "name": "/dev/vdf",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/zram0": {
            "fstype": "",
            "label": "",
            "name": "/dev/zram0",
            "size": "1.9G",
            "type": "disk",
            "uuid": ""
        }
    }
}
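
The per-device map above (fstype, label, size, type, uuid) is produced by a helper used by the test suite; the exact module is not shown in this log. For reference only, roughly the same information can be collected with lsblk, for example via a task like the following (task name and registered variable are illustrative):

    - name: Collect block device info with lsblk (illustrative; not the helper the tests use)
      command: lsblk -p -o NAME,FSTYPE,LABEL,SIZE,TYPE,UUID --json
      register: lsblk_info   # hypothetical register name
      changed_when: false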

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmprua6lrek/tests/verify-role-results.yml:19
Wednesday 06 July 2022  11:02:17 +0000 (0:00:00.424)       0:01:10.200 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "cmd": [
        "cat",
        "/etc/fstab"
    ],
    "delta": "0:00:00.002888",
    "end": "2022-07-06 11:02:17.309299",
    "rc": 0,
    "start": "2022-07-06 11:02:17.306411"
}

STDOUT:


#
# /etc/fstab
# Created by anaconda on Tue Jul  5 07:18:20 2022
#
# Accessible filesystems, by reference, are maintained under '/dev/disk/'.
# See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info.
#
# After editing this file, run 'systemctl daemon-reload' to update systemd
# units generated from this file.
#
UUID=fbdaf05f-1a41-4dc5-b56e-a10edb430f9a /                       btrfs   subvol=root,compress=zstd:1 0 0
UUID=5f2f82d0-ae0a-4574-8811-62a31a51a870 /boot                   ext4    defaults        1 2
UUID=5B84-6DD7          /boot/efi               vfat    defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
UUID=fbdaf05f-1a41-4dc5-b56e-a10edb430f9a /home                   btrfs   subvol=home,compress=zstd:1 0 0
/dev/vdb1	/var/cache/dnf	auto	defaults,nofail,comment=cloudconfig	0	2
/dev/vdc1	/var/lib/dnf	auto	defaults,nofail,comment=cloudconfig	0	2
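
Note that after the removal neither /dev/mapper/foo-test1 nor any mount point for it appears in the fstab contents above; only the stock system entries remain. The later fstab verification tasks boil down to a check along these lines (a sketch; the registered variable name here is hypothetical):

    - name: Assert the removed volume left no /etc/fstab entry (sketch)
      assert:
        that:
          # storage_test_fstab is assumed to hold the registered output
          # of the "cat /etc/fstab" task shown above
          - "'/dev/mapper/foo-test1' not in storage_test_fstab.stdout"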

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmprua6lrek/tests/verify-role-results.yml:24
Wednesday 06 July 2022  11:02:17 +0000 (0:00:00.389)       0:01:10.589 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "cmd": [
        "cat",
        "/etc/crypttab"
    ],
    "delta": "0:00:00.002826",
    "end": "2022-07-06 11:02:17.701567",
    "failed_when_result": false,
    "rc": 0,
    "start": "2022-07-06 11:02:17.698741"
}

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmprua6lrek/tests/verify-role-results.yml:33
Wednesday 06 July 2022  11:02:18 +0000 (0:00:00.394)       0:01:10.983 ******** 
[WARNING]: The loop variable 'storage_test_pool' is already in use. You should
set the `loop_var` value in the `loop_control` option for the task to something
else to avoid variable collisions and unexpected behavior.
included: /tmp/tmprua6lrek/tests/test-verify-pool.yml for /cache/fedora-35.qcow2.snap => (item={'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'state': 'absent', 'type': 'lvm', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '', 'name': 'test1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1'}], 'raid_chunk_size': None})

TASK [Set _storage_pool_tests] *************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool.yml:5
Wednesday 06 July 2022  11:02:18 +0000 (0:00:00.063)       0:01:11.047 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_pool_tests": [
            "members",
            "volumes"
        ]
    },
    "changed": false
}

TASK [include_tasks] ***********************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool.yml:18
Wednesday 06 July 2022  11:02:18 +0000 (0:00:00.037)       0:01:11.084 ******** 
included: /tmp/tmprua6lrek/tests/test-verify-pool-members.yml for /cache/fedora-35.qcow2.snap => (item=members)
included: /tmp/tmprua6lrek/tests/test-verify-pool-volumes.yml for /cache/fedora-35.qcow2.snap => (item=volumes)

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool-members.yml:1
Wednesday 06 July 2022  11:02:18 +0000 (0:00:00.047)       0:01:11.131 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_expected_pv_count": "0",
        "_storage_test_pool_pvs_lvm": []
    },
    "changed": false
}

TASK [Get the canonical device path for each member device] ********************
task path: /tmp/tmprua6lrek/tests/test-verify-pool-members.yml:10
Wednesday 06 July 2022  11:02:18 +0000 (0:00:00.099)       0:01:11.230 ******** 

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool-members.yml:19
Wednesday 06 July 2022  11:02:18 +0000 (0:00:00.022)       0:01:11.252 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "__pvs_lvm_len": "0"
    },
    "changed": false
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool-members.yml:23
Wednesday 06 July 2022  11:02:18 +0000 (0:00:00.092)       0:01:11.345 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_pool_pvs": []
    },
    "changed": false
}

TASK [Verify PV count] *********************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool-members.yml:27
Wednesday 06 July 2022  11:02:18 +0000 (0:00:00.123)       0:01:11.468 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed
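
With the pool absent, both the expected PV count and the PV list are empty, so the assertion passes trivially. In terms of the facts printed above, the check amounts to something like the following (a sketch; the exact expression in test-verify-pool-members.yml may differ):

    - name: Verify PV count (sketch using the fact names shown above)
      assert:
        that:
          - _storage_test_pool_pvs | length == _storage_test_expected_pv_count | int
        msg: "Unexpected number of PVs for pool 'foo'"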

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool-members.yml:34
Wednesday 06 July 2022  11:02:18 +0000 (0:00:00.052)       0:01:11.521 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_expected_pv_type": "disk"
    },
    "changed": false
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool-members.yml:38
Wednesday 06 July 2022  11:02:18 +0000 (0:00:00.038)       0:01:11.559 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_expected_pv_type": "disk"
    },
    "changed": false
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool-members.yml:42
Wednesday 06 July 2022  11:02:18 +0000 (0:00:00.050)       0:01:11.609 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check the type of each PV] ***********************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool-members.yml:46
Wednesday 06 July 2022  11:02:19 +0000 (0:00:00.026)       0:01:11.636 ******** 

TASK [Check MD RAID] ***********************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool-members.yml:56
Wednesday 06 July 2022  11:02:19 +0000 (0:00:00.021)       0:01:11.657 ******** 
included: /tmp/tmprua6lrek/tests/verify-pool-md.yml for /cache/fedora-35.qcow2.snap

TASK [get information about RAID] **********************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-md.yml:6
Wednesday 06 July 2022  11:02:19 +0000 (0:00:00.042)       0:01:11.700 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-md.yml:12
Wednesday 06 July 2022  11:02:19 +0000 (0:00:00.025)       0:01:11.725 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-md.yml:16
Wednesday 06 July 2022  11:02:19 +0000 (0:00:00.024)       0:01:11.749 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-md.yml:20
Wednesday 06 July 2022  11:02:19 +0000 (0:00:00.024)       0:01:11.773 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-md.yml:24
Wednesday 06 July 2022  11:02:19 +0000 (0:00:00.025)       0:01:11.799 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-md.yml:30
Wednesday 06 July 2022  11:02:19 +0000 (0:00:00.025)       0:01:11.825 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-md.yml:36
Wednesday 06 July 2022  11:02:19 +0000 (0:00:00.023)       0:01:11.848 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-md.yml:44
Wednesday 06 July 2022  11:02:19 +0000 (0:00:00.022)       0:01:11.871 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_md_active_devices_re": null,
        "storage_test_md_metadata_version_re": null,
        "storage_test_md_spare_devices_re": null
    },
    "changed": false
}

TASK [Check LVM RAID] **********************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool-members.yml:59
Wednesday 06 July 2022  11:02:19 +0000 (0:00:00.045)       0:01:11.917 ******** 
included: /tmp/tmprua6lrek/tests/verify-pool-members-lvmraid.yml for /cache/fedora-35.qcow2.snap

TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/tmprua6lrek/tests/verify-pool-members-lvmraid.yml:1
Wednesday 06 July 2022  11:02:19 +0000 (0:00:00.045)       0:01:11.963 ******** 
included: /tmp/tmprua6lrek/tests/verify-pool-member-lvmraid.yml for /cache/fedora-35.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '', 'name': 'test1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1'})

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-lvmraid.yml:3
Wednesday 06 July 2022  11:02:19 +0000 (0:00:00.045)       0:01:12.008 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-lvmraid.yml:8
Wednesday 06 July 2022  11:02:19 +0000 (0:00:00.030)       0:01:12.039 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-lvmraid.yml:12
Wednesday 06 July 2022  11:02:19 +0000 (0:00:00.029)       0:01:12.068 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check Thin Pools] ********************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool-members.yml:62
Wednesday 06 July 2022  11:02:19 +0000 (0:00:00.027)       0:01:12.095 ******** 
included: /tmp/tmprua6lrek/tests/verify-pool-members-thin.yml for /cache/fedora-35.qcow2.snap

TASK [Validate pool member thinpool settings] **********************************
task path: /tmp/tmprua6lrek/tests/verify-pool-members-thin.yml:1
Wednesday 06 July 2022  11:02:19 +0000 (0:00:00.044)       0:01:12.140 ******** 
included: /tmp/tmprua6lrek/tests/verify-pool-member-thin.yml for /cache/fedora-35.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '', 'name': 'test1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1'})

TASK [Get information about thinpool] ******************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-thin.yml:3
Wednesday 06 July 2022  11:02:19 +0000 (0:00:00.041)       0:01:12.181 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check that volume is in correct thinpool (when thinp name is provided)] ***
task path: /tmp/tmprua6lrek/tests/verify-pool-member-thin.yml:8
Wednesday 06 July 2022  11:02:19 +0000 (0:00:00.023)       0:01:12.205 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check that volume is in thinpool (when thinp name is not provided)] ******
task path: /tmp/tmprua6lrek/tests/verify-pool-member-thin.yml:13
Wednesday 06 July 2022  11:02:19 +0000 (0:00:00.022)       0:01:12.228 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-thin.yml:17
Wednesday 06 July 2022  11:02:19 +0000 (0:00:00.023)       0:01:12.251 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check member encryption] *************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool-members.yml:65
Wednesday 06 July 2022  11:02:19 +0000 (0:00:00.025)       0:01:12.277 ******** 
included: /tmp/tmprua6lrek/tests/verify-pool-members-encryption.yml for /cache/fedora-35.qcow2.snap

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-members-encryption.yml:4
Wednesday 06 July 2022  11:02:19 +0000 (0:00:00.047)       0:01:12.325 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_expected_crypttab_entries": "0",
        "_storage_test_expected_crypttab_key_file": "-"
    },
    "changed": false
}

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-members-encryption.yml:8
Wednesday 06 July 2022  11:02:19 +0000 (0:00:00.050)       0:01:12.375 ******** 

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/tmprua6lrek/tests/verify-pool-members-encryption.yml:15
Wednesday 06 July 2022  11:02:19 +0000 (0:00:00.024)       0:01:12.399 ******** 

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-members-encryption.yml:22
Wednesday 06 July 2022  11:02:19 +0000 (0:00:00.023)       0:01:12.422 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": null,
        "_storage_test_crypttab_key_file": null
    },
    "changed": false
}

TASK [Check VDO] ***************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool-members.yml:68
Wednesday 06 July 2022  11:02:19 +0000 (0:00:00.045)       0:01:12.468 ******** 
included: /tmp/tmprua6lrek/tests/verify-pool-members-vdo.yml for /cache/fedora-35.qcow2.snap

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-members-vdo.yml:1
Wednesday 06 July 2022  11:02:19 +0000 (0:00:00.053)       0:01:12.521 ******** 
included: /tmp/tmprua6lrek/tests/verify-pool-member-vdo.yml for /cache/fedora-35.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '', 'name': 'test1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1'})

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-vdo.yml:3
Wednesday 06 July 2022  11:02:19 +0000 (0:00:00.047)       0:01:12.569 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-vdo.yml:8
Wednesday 06 July 2022  11:02:19 +0000 (0:00:00.026)       0:01:12.595 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [assert] ******************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-vdo.yml:11
Wednesday 06 July 2022  11:02:20 +0000 (0:00:00.068)       0:01:12.664 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [assert] ******************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-vdo.yml:16
Wednesday 06 July 2022  11:02:20 +0000 (0:00:00.026)       0:01:12.690 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-vdo.yml:21
Wednesday 06 July 2022  11:02:20 +0000 (0:00:00.026)       0:01:12.717 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [assert] ******************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-vdo.yml:24
Wednesday 06 July 2022  11:02:20 +0000 (0:00:00.026)       0:01:12.743 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [assert] ******************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-vdo.yml:29
Wednesday 06 July 2022  11:02:20 +0000 (0:00:00.026)       0:01:12.770 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/verify-pool-member-vdo.yml:39
Wednesday 06 July 2022  11:02:20 +0000 (0:00:00.026)       0:01:12.796 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_vdo_status": null
    },
    "changed": false
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool-members.yml:71
Wednesday 06 July 2022  11:02:20 +0000 (0:00:00.035)       0:01:12.831 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "__pvs_lvm_len": null,
        "_storage_test_expected_pv_count": null,
        "_storage_test_expected_pv_type": null,
        "_storage_test_pool_pvs": [],
        "_storage_test_pool_pvs_lvm": []
    },
    "changed": false
}

TASK [verify the volumes] ******************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-pool-volumes.yml:3
Wednesday 06 July 2022  11:02:20 +0000 (0:00:00.038)       0:01:12.870 ******** 
[WARNING]: The loop variable 'storage_test_volume' is already in use. You
should set the `loop_var` value in the `loop_control` option for the task to
something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmprua6lrek/tests/test-verify-volume.yml for /cache/fedora-35.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '', 'name': 'test1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1'})

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume.yml:2
Wednesday 06 July 2022  11:02:20 +0000 (0:00:00.044)       0:01:12.915 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_volume_present": false,
        "_storage_volume_tests": [
            "mount",
            "fstab",
            "fs",
            "device",
            "encryption",
            "md",
            "size",
            "cache"
        ]
    },
    "changed": false
}

TASK [include_tasks] ***********************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume.yml:10
Wednesday 06 July 2022  11:02:20 +0000 (0:00:00.048)       0:01:12.963 ******** 
included: /tmp/tmprua6lrek/tests/test-verify-volume-mount.yml for /cache/fedora-35.qcow2.snap => (item=mount)
included: /tmp/tmprua6lrek/tests/test-verify-volume-fstab.yml for /cache/fedora-35.qcow2.snap => (item=fstab)
included: /tmp/tmprua6lrek/tests/test-verify-volume-fs.yml for /cache/fedora-35.qcow2.snap => (item=fs)
included: /tmp/tmprua6lrek/tests/test-verify-volume-device.yml for /cache/fedora-35.qcow2.snap => (item=device)
included: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml for /cache/fedora-35.qcow2.snap => (item=encryption)
included: /tmp/tmprua6lrek/tests/test-verify-volume-md.yml for /cache/fedora-35.qcow2.snap => (item=md)
included: /tmp/tmprua6lrek/tests/test-verify-volume-size.yml for /cache/fedora-35.qcow2.snap => (item=size)
included: /tmp/tmprua6lrek/tests/test-verify-volume-cache.yml for /cache/fedora-35.qcow2.snap => (item=cache)

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-mount.yml:6
Wednesday 06 July 2022  11:02:20 +0000 (0:00:00.080)       0:01:13.043 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_device_path": "/dev/mapper/foo-test1"
    },
    "changed": false
}

TASK [Set some facts] **********************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-mount.yml:14
Wednesday 06 July 2022  11:02:20 +0000 (0:00:00.044)       0:01:13.088 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_mount_device_matches": [],
        "storage_test_mount_expected_match_count": "0",
        "storage_test_mount_point_matches": [],
        "storage_test_swap_expected_matches": "0"
    },
    "changed": false
}

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-mount.yml:28
Wednesday 06 July 2022  11:02:20 +0000 (0:00:00.057)       0:01:13.146 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-mount.yml:37
Wednesday 06 July 2022  11:02:20 +0000 (0:00:00.024)       0:01:13.170 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed
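
Because mount_point is empty and the volume is being removed, storage_test_mount_expected_match_count is 0 and no entry for the device should be present in the host's mount facts. The passing check is effectively the following (a sketch; the real task wording may differ):

    - name: Verify the removed volume is not mounted (sketch using the facts set above)
      assert:
        that:
          - storage_test_mount_point_matches | length == storage_test_mount_expected_match_count | int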

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-mount.yml:45
Wednesday 06 July 2022  11:02:20 +0000 (0:00:00.048)       0:01:13.219 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [command] *****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-mount.yml:54
Wednesday 06 July 2022  11:02:20 +0000 (0:00:00.041)       0:01:13.260 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Gather swap info] ********************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-mount.yml:58
Wednesday 06 July 2022  11:02:20 +0000 (0:00:00.023)       0:01:13.283 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify swap status] ******************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-mount.yml:63
Wednesday 06 July 2022  11:02:20 +0000 (0:00:00.022)       0:01:13.305 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Unset facts] *************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-mount.yml:75
Wednesday 06 July 2022  11:02:20 +0000 (0:00:00.021)       0:01:13.327 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_mount_device_matches": null,
        "storage_test_mount_expected_match_count": null,
        "storage_test_mount_point_matches": null,
        "storage_test_swap_expected_matches": null,
        "storage_test_swaps": null,
        "storage_test_sys_node": null
    },
    "changed": false
}

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-fstab.yml:2
Wednesday 06 July 2022  11:02:20 +0000 (0:00:00.038)       0:01:13.366 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_fstab_expected_id_matches": "0",
        "storage_test_fstab_expected_mount_options_matches": "0",
        "storage_test_fstab_expected_mount_point_matches": "0",
        "storage_test_fstab_id_matches": [],
        "storage_test_fstab_mount_options_matches": [],
        "storage_test_fstab_mount_point_matches": []
    },
    "changed": false
}

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-fstab.yml:25
Wednesday 06 July 2022  11:02:20 +0000 (0:00:00.058)       0:01:13.424 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-fstab.yml:32
Wednesday 06 July 2022  11:02:20 +0000 (0:00:00.022)       0:01:13.447 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-fstab.yml:39
Wednesday 06 July 2022  11:02:20 +0000 (0:00:00.050)       0:01:13.498 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Clean up variables] ******************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-fstab.yml:49
Wednesday 06 July 2022  11:02:20 +0000 (0:00:00.034)       0:01:13.532 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_fstab_expected_id_matches": null,
        "storage_test_fstab_expected_mount_options_matches": null,
        "storage_test_fstab_expected_mount_point_matches": null,
        "storage_test_fstab_id_matches": null,
        "storage_test_fstab_mount_options_matches": null,
        "storage_test_fstab_mount_point_matches": null
    },
    "changed": false
}

TASK [Verify fs type] **********************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-fs.yml:4
Wednesday 06 July 2022  11:02:20 +0000 (0:00:00.035)       0:01:13.567 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify fs label] *********************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-fs.yml:10
Wednesday 06 July 2022  11:02:20 +0000 (0:00:00.024)       0:01:13.592 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [See whether the device node is present] **********************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-device.yml:4
Wednesday 06 July 2022  11:02:21 +0000 (0:00:00.025)       0:01:13.618 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "stat": {
        "exists": false
    }
}

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-device.yml:10
Wednesday 06 July 2022  11:02:21 +0000 (0:00:00.386)       0:01:14.005 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed
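
The stat above returns exists: false, which is exactly what is expected once the LV has been destroyed. A minimal stand-alone version of this check (task names and the registered variable are illustrative) would be:

    - name: Stat the volume's device node
      stat:
        path: /dev/mapper/foo-test1
      register: storage_test_dev   # hypothetical register name

    - name: Assert the device node is gone for the absent volume
      assert:
        that:
          - not storage_test_dev.stat.exists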

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-device.yml:18
Wednesday 06 July 2022  11:02:21 +0000 (0:00:00.038)       0:01:14.043 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-device.yml:24
Wednesday 06 July 2022  11:02:21 +0000 (0:00:00.026)       0:01:14.070 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "st_volume_type": "lvm"
    },
    "changed": false
}

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-device.yml:28
Wednesday 06 July 2022  11:02:21 +0000 (0:00:00.039)       0:01:14.109 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-device.yml:33
Wednesday 06 July 2022  11:02:21 +0000 (0:00:00.024)       0:01:14.134 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:3
Wednesday 06 July 2022  11:02:21 +0000 (0:00:00.023)       0:01:14.157 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Ensure cryptsetup is present] ********************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:10
Wednesday 06 July 2022  11:02:21 +0000 (0:00:00.024)       0:01:14.182 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:15
Wednesday 06 July 2022  11:02:23 +0000 (0:00:02.152)       0:01:16.334 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:21
Wednesday 06 July 2022  11:02:23 +0000 (0:00:00.024)       0:01:16.358 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:30
Wednesday 06 July 2022  11:02:23 +0000 (0:00:00.025)       0:01:16.383 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:38
Wednesday 06 July 2022  11:02:23 +0000 (0:00:00.025)       0:01:16.409 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:44
Wednesday 06 July 2022  11:02:23 +0000 (0:00:00.028)       0:01:16.437 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:49
Wednesday 06 July 2022  11:02:23 +0000 (0:00:00.024)       0:01:16.462 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:55
Wednesday 06 July 2022  11:02:23 +0000 (0:00:00.025)       0:01:16.488 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:61
Wednesday 06 July 2022  11:02:23 +0000 (0:00:00.024)       0:01:16.513 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:67
Wednesday 06 July 2022  11:02:23 +0000 (0:00:00.025)       0:01:16.538 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": [],
        "_storage_test_expected_crypttab_entries": "0",
        "_storage_test_expected_crypttab_key_file": "-"
    },
    "changed": false
}

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:74
Wednesday 06 July 2022  11:02:23 +0000 (0:00:00.059)       0:01:16.598 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed
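
Since the volume is not encrypted, the expected crypttab entry count is 0 and _storage_test_crypttab_entries is empty, so the assertion passes. Expressed with the fact names shown above, the check is roughly (sketch):

    - name: Check for /etc/crypttab entries for this volume (sketch)
      assert:
        that:
          - _storage_test_crypttab_entries | length == _storage_test_expected_crypttab_entries | int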

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:79
Wednesday 06 July 2022  11:02:24 +0000 (0:00:00.060)       0:01:16.658 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:85
Wednesday 06 July 2022  11:02:24 +0000 (0:00:00.039)       0:01:16.698 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:91
Wednesday 06 July 2022  11:02:24 +0000 (0:00:00.040)       0:01:16.738 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:97
Wednesday 06 July 2022  11:02:24 +0000 (0:00:00.039)       0:01:16.777 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": null,
        "_storage_test_expected_crypttab_entries": null,
        "_storage_test_expected_crypttab_key_file": null
    },
    "changed": false
}

TASK [get information about RAID] **********************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-md.yml:7
Wednesday 06 July 2022  11:02:24 +0000 (0:00:00.038)       0:01:16.816 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-md.yml:13
Wednesday 06 July 2022  11:02:24 +0000 (0:00:00.038)       0:01:16.854 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-md.yml:17
Wednesday 06 July 2022  11:02:24 +0000 (0:00:00.049)       0:01:16.904 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-md.yml:21
Wednesday 06 July 2022  11:02:24 +0000 (0:00:00.041)       0:01:16.945 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-md.yml:25
Wednesday 06 July 2022  11:02:24 +0000 (0:00:00.037)       0:01:16.983 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-md.yml:31
Wednesday 06 July 2022  11:02:24 +0000 (0:00:00.040)       0:01:17.023 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-md.yml:37
Wednesday 06 July 2022  11:02:24 +0000 (0:00:00.041)       0:01:17.065 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-size.yml:3
Wednesday 06 July 2022  11:02:24 +0000 (0:00:00.038)       0:01:17.103 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-size.yml:9
Wednesday 06 July 2022  11:02:24 +0000 (0:00:00.026)       0:01:17.129 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-size.yml:15
Wednesday 06 July 2022  11:02:24 +0000 (0:00:00.036)       0:01:17.166 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [debug] *******************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-size.yml:20
Wednesday 06 July 2022  11:02:24 +0000 (0:00:00.038)       0:01:17.205 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "storage_test_expected_size": "3221225472"
}

TASK [debug] *******************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-size.yml:25
Wednesday 06 July 2022  11:02:24 +0000 (0:00:00.033)       0:01:17.239 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-size.yml:28
Wednesday 06 July 2022  11:02:24 +0000 (0:00:00.035)       0:01:17.275 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-size.yml:31
Wednesday 06 July 2022  11:02:24 +0000 (0:00:00.038)       0:01:17.313 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [debug] *******************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-size.yml:36
Wednesday 06 July 2022  11:02:24 +0000 (0:00:00.042)       0:01:17.355 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmprua6lrek/tests/test-verify-volume-size.yml:39
Wednesday 06 July 2022  11:02:24 +0000 (0:00:00.040)       0:01:17.396 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [debug] *******************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-size.yml:44
Wednesday 06 July 2022  11:02:24 +0000 (0:00:00.043)       0:01:17.439 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "storage_test_actual_size": {
        "changed": false,
        "skip_reason": "Conditional result was False",
        "skipped": true
    }
}

TASK [debug] *******************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-size.yml:47
Wednesday 06 July 2022  11:02:24 +0000 (0:00:00.038)       0:01:17.478 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "storage_test_expected_size": "3221225472"
}

TASK [assert] ******************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-size.yml:50
Wednesday 06 July 2022  11:02:24 +0000 (0:00:00.040)       0:01:17.518 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Get information about the LV] ********************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-cache.yml:6
Wednesday 06 July 2022  11:02:24 +0000 (0:00:00.025)       0:01:17.543 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-cache.yml:14
Wednesday 06 July 2022  11:02:24 +0000 (0:00:00.024)       0:01:17.568 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check segment type] ******************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-cache.yml:17
Wednesday 06 July 2022  11:02:24 +0000 (0:00:00.025)       0:01:17.593 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-cache.yml:22
Wednesday 06 July 2022  11:02:25 +0000 (0:00:00.027)       0:01:17.621 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-cache.yml:26
Wednesday 06 July 2022  11:02:25 +0000 (0:00:00.026)       0:01:17.647 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-cache.yml:32
Wednesday 06 July 2022  11:02:25 +0000 (0:00:00.024)       0:01:17.672 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check cache size] ********************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume-cache.yml:36
Wednesday 06 July 2022  11:02:25 +0000 (0:00:00.023)       0:01:17.695 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Clean up facts] **********************************************************
task path: /tmp/tmprua6lrek/tests/test-verify-volume.yml:16
Wednesday 06 July 2022  11:02:25 +0000 (0:00:00.024)       0:01:17.719 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_volume_present": null
    },
    "changed": false
}

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmprua6lrek/tests/verify-role-results.yml:40
Wednesday 06 July 2022  11:02:25 +0000 (0:00:00.040)       0:01:17.760 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_pool": null
    },
    "changed": false
}

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmprua6lrek/tests/verify-role-results.yml:47
Wednesday 06 July 2022  11:02:25 +0000 (0:00:00.083)       0:01:17.843 ******** 

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmprua6lrek/tests/verify-role-results.yml:57
Wednesday 06 July 2022  11:02:25 +0000 (0:00:00.062)       0:01:17.906 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_blkinfo": null,
        "storage_test_crypttab": null,
        "storage_test_fstab": null,
        "storage_test_volume": null
    },
    "changed": false
}
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/fedora-35.qcow2.snap : ok=383  changed=4    unreachable=0    failed=0    skipped=322  rescued=0    ignored=0   

Wednesday 06 July 2022  11:02:25 +0000 (0:00:00.048)       0:01:17.954 ******** 
=============================================================================== 
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.60s
/tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 
linux-system-roles.storage : make sure blivet is available -------------- 2.54s
/tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:7 
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.45s
/tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 
Ensure cryptsetup is present -------------------------------------------- 2.15s
/tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:10 -------------------
linux-system-roles.storage : make sure required packages are installed --- 2.07s
/tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:44 
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.04s
/tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 
Ensure cryptsetup is present -------------------------------------------- 2.01s
/tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:10 -------------------
linux-system-roles.storage : manage the pools and volumes to match the specified state --- 2.00s
/tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:77 
linux-system-roles.storage : get service facts -------------------------- 1.99s
/tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:51 
Ensure cryptsetup is present -------------------------------------------- 1.99s
/tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:10 -------------------
Ensure cryptsetup is present -------------------------------------------- 1.97s
/tmp/tmprua6lrek/tests/test-verify-volume-encryption.yml:10 -------------------
Gathering Facts --------------------------------------------------------- 1.34s
/tmp/tmprua6lrek/tests/tests_remove_mount.yml:2 -------------------------------
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 1.00s
/tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:146 
linux-system-roles.storage : Update facts ------------------------------- 0.96s
/tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 
linux-system-roles.storage : Update facts ------------------------------- 0.96s
/tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 
linux-system-roles.storage : Update facts ------------------------------- 0.96s
/tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 
linux-system-roles.storage : Update facts ------------------------------- 0.95s
/tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 
linux-system-roles.storage : Update facts ------------------------------- 0.95s
/tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:198 
linux-system-roles.storage : get required packages ---------------------- 0.77s
/tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:24 
linux-system-roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.74s
/tmp/tmprua6lrek/tests/roles/linux-system-roles.storage/tasks/main-blivet.yml:163 
ansible-playbook [core 2.12.6]
  config file = /etc/ansible/ansible.cfg
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python3.9/site-packages/ansible
  ansible collection location = /tmp/tmpfdufgi2k
  executable location = /usr/bin/ansible-playbook
  python version = 3.9.13 (main, May 18 2022, 00:00:00) [GCC 11.3.1 20220421 (Red Hat 11.3.1-2)]
  jinja version = 2.11.3
  libyaml = True
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_remove_mount.yml ***********************************************
1 plays in /tmp/tmpus9dv81c/tests/storage/tests_remove_mount.yml

PLAY [all] *********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/tmpus9dv81c/tests/storage/tests_remove_mount.yml:2
Wednesday 06 July 2022  14:53:20 +0000 (0:00:00.014)       0:00:00.014 ******** 
ok: [/cache/fedora-35.qcow2.snap]
META: ran handlers

TASK [include_role : fedora.linux_system_roles.storage] ************************
task path: /tmp/tmpus9dv81c/tests/storage/tests_remove_mount.yml:12
Wednesday 06 July 2022  14:53:21 +0000 (0:00:01.279)       0:00:01.293 ******** 

TASK [fedora.linux_system_roles.storage : set platform/version specific variables] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Wednesday 06 July 2022  14:53:21 +0000 (0:00:00.038)       0:00:01.331 ******** 
included: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for /cache/fedora-35.qcow2.snap

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Wednesday 06 July 2022  14:53:21 +0000 (0:00:00.029)       0:00:01.361 ******** 
ok: [/cache/fedora-35.qcow2.snap]

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Wednesday 06 July 2022  14:53:22 +0000 (0:00:00.551)       0:00:01.912 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => (item=RedHat.yml)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
ok: [/cache/fedora-35.qcow2.snap] => (item=Fedora.yml) => {
    "ansible_facts": {
        "_storage_copr_packages": [
            {
                "packages": [
                    "vdo",
                    "kmod-vdo"
                ],
                "repository": "rhawalsh/dm-vdo"
            }
        ],
        "_storage_copr_support_packages": [
            "dnf-plugins-core"
        ],
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/vars/Fedora.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "Fedora.yml"
}
skipping: [/cache/fedora-35.qcow2.snap] => (item=Fedora_35.yml)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "Fedora_35.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [/cache/fedora-35.qcow2.snap] => (item=Fedora_35.yml)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "Fedora_35.yml",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Wednesday 06 July 2022  14:53:22 +0000 (0:00:00.068)       0:00:01.980 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Wednesday 06 July 2022  14:53:22 +0000 (0:00:00.031)       0:00:02.012 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : include the appropriate provider tasks] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Wednesday 06 July 2022  14:53:22 +0000 (0:00:00.029)       0:00:02.041 ******** 
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
included: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for /cache/fedora-35.qcow2.snap

TASK [fedora.linux_system_roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
Wednesday 06 July 2022  14:53:22 +0000 (0:00:00.057)       0:00:02.098 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : make sure blivet is available] *******
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:7
Wednesday 06 July 2022  14:53:22 +0000 (0:00:00.019)       0:00:02.118 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [fedora.linux_system_roles.storage : show storage_pools] ******************
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:13
Wednesday 06 July 2022  14:53:24 +0000 (0:00:02.656)       0:00:04.774 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined"
}

TASK [fedora.linux_system_roles.storage : show storage_volumes] ****************
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:18
Wednesday 06 July 2022  14:53:25 +0000 (0:00:00.035)       0:00:04.809 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined"
}

TASK [fedora.linux_system_roles.storage : get required packages] ***************
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:23
Wednesday 06 July 2022  14:53:25 +0000 (0:00:00.030)       0:00:04.840 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [],
    "pools": [],
    "volumes": []
}

TASK [fedora.linux_system_roles.storage : enable copr repositories if needed] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:35
Wednesday 06 July 2022  14:53:25 +0000 (0:00:00.730)       0:00:05.571 ******** 
included: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for /cache/fedora-35.qcow2.snap

TASK [fedora.linux_system_roles.storage : check if the COPR support packages should be installed] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
Wednesday 06 July 2022  14:53:25 +0000 (0:00:00.046)       0:00:05.617 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']})  => {
    "ansible_loop_var": "repo",
    "changed": false,
    "repo": {
        "packages": [
            "vdo",
            "kmod-vdo"
        ],
        "repository": "rhawalsh/dm-vdo"
    },
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : make sure COPR support packages are present] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13
Wednesday 06 July 2022  14:53:25 +0000 (0:00:00.044)       0:00:05.661 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : enable COPRs] ************************
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:18
Wednesday 06 July 2022  14:53:25 +0000 (0:00:00.034)       0:00:05.695 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => (item={'repository': 'rhawalsh/dm-vdo', 'packages': ['vdo', 'kmod-vdo']})  => {
    "ansible_loop_var": "repo",
    "changed": false,
    "repo": {
        "packages": [
            "vdo",
            "kmod-vdo"
        ],
        "repository": "rhawalsh/dm-vdo"
    },
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : make sure required packages are installed] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:41
Wednesday 06 July 2022  14:53:25 +0000 (0:00:00.042)       0:00:05.738 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [fedora.linux_system_roles.storage : get service facts] *******************
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:47
Wednesday 06 July 2022  14:53:27 +0000 (0:00:01.879)       0:00:07.618 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "services": {
            "NetworkManager-dispatcher.service": {
                "name": "NetworkManager-dispatcher.service",
                "source": "systemd",
                "state": "inactive",
                "status": "enabled"
            },
            "NetworkManager-wait-online.service": {
                "name": "NetworkManager-wait-online.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "NetworkManager.service": {
                "name": "NetworkManager.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "arp-ethers.service": {
                "name": "arp-ethers.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "auditd.service": {
                "name": "auditd.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "autovt@.service": {
                "name": "autovt@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "alias"
            },
            "blivet.service": {
                "name": "blivet.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "blk-availability.service": {
                "name": "blk-availability.service",
                "source": "systemd",
                "state": "stopped",
                "status": "disabled"
            },
            "bluetooth.service": {
                "name": "bluetooth.service",
                "source": "systemd",
                "state": "inactive",
                "status": "enabled"
            },
            "chrony-wait.service": {
                "name": "chrony-wait.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "chronyd.service": {
                "name": "chronyd.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "cloud-config.service": {
                "name": "cloud-config.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "cloud-final.service": {
                "name": "cloud-final.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "cloud-init-local.service": {
                "name": "cloud-init-local.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "cloud-init.service": {
                "name": "cloud-init.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "console-getty.service": {
                "name": "console-getty.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "console-login-helper-messages-gensnippet-os-release.service": {
                "name": "console-login-helper-messages-gensnippet-os-release.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "console-login-helper-messages-gensnippet-ssh-keys.service": {
                "name": "console-login-helper-messages-gensnippet-ssh-keys.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "container-getty@.service": {
                "name": "container-getty@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "dbus-broker.service": {
                "name": "dbus-broker.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "dbus-org.bluez.service": {
                "name": "dbus-org.bluez.service",
                "source": "systemd",
                "state": "inactive",
                "status": "alias"
            },
            "dbus-org.freedesktop.hostname1.service": {
                "name": "dbus-org.freedesktop.hostname1.service",
                "source": "systemd",
                "state": "active",
                "status": "alias"
            },
            "dbus-org.freedesktop.locale1.service": {
                "name": "dbus-org.freedesktop.locale1.service",
                "source": "systemd",
                "state": "inactive",
                "status": "alias"
            },
            "dbus-org.freedesktop.login1.service": {
                "name": "dbus-org.freedesktop.login1.service",
                "source": "systemd",
                "state": "active",
                "status": "alias"
            },
            "dbus-org.freedesktop.nm-dispatcher.service": {
                "name": "dbus-org.freedesktop.nm-dispatcher.service",
                "source": "systemd",
                "state": "inactive",
                "status": "alias"
            },
            "dbus-org.freedesktop.oom1.service": {
                "name": "dbus-org.freedesktop.oom1.service",
                "source": "systemd",
                "state": "active",
                "status": "alias"
            },
            "dbus-org.freedesktop.portable1.service": {
                "name": "dbus-org.freedesktop.portable1.service",
                "source": "systemd",
                "state": "inactive",
                "status": "alias"
            },
            "dbus-org.freedesktop.resolve1.service": {
                "name": "dbus-org.freedesktop.resolve1.service",
                "source": "systemd",
                "state": "active",
                "status": "alias"
            },
            "dbus-org.freedesktop.timedate1.service": {
                "name": "dbus-org.freedesktop.timedate1.service",
                "source": "systemd",
                "state": "inactive",
                "status": "alias"
            },
            "dbus.service": {
                "name": "dbus.service",
                "source": "systemd",
                "state": "active",
                "status": "alias"
            },
            "debug-shell.service": {
                "name": "debug-shell.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "display-manager.service": {
                "name": "display-manager.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "dm-event.service": {
                "name": "dm-event.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dmraid-activation.service": {
                "name": "dmraid-activation.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "dnf-makecache.service": {
                "name": "dnf-makecache.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-cmdline.service": {
                "name": "dracut-cmdline.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-initqueue.service": {
                "name": "dracut-initqueue.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-mount.service": {
                "name": "dracut-mount.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-pre-mount.service": {
                "name": "dracut-pre-mount.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-pre-pivot.service": {
                "name": "dracut-pre-pivot.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-pre-trigger.service": {
                "name": "dracut-pre-trigger.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-pre-udev.service": {
                "name": "dracut-pre-udev.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "dracut-shutdown.service": {
                "name": "dracut-shutdown.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "emergency.service": {
                "name": "emergency.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "fcoe.service": {
                "name": "fcoe.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "fstrim.service": {
                "name": "fstrim.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "fwupd-offline-update.service": {
                "name": "fwupd-offline-update.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "fwupd-refresh.service": {
                "name": "fwupd-refresh.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "fwupd.service": {
                "name": "fwupd.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "getty@.service": {
                "name": "getty@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "enabled"
            },
            "getty@tty1.service": {
                "name": "getty@tty1.service",
                "source": "systemd",
                "state": "running",
                "status": "active"
            },
            "grub-boot-indeterminate.service": {
                "name": "grub-boot-indeterminate.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "grub2-systemd-integration.service": {
                "name": "grub2-systemd-integration.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "hv_kvp_daemon.service": {
                "name": "hv_kvp_daemon.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "initrd-cleanup.service": {
                "name": "initrd-cleanup.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "initrd-parse-etc.service": {
                "name": "initrd-parse-etc.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "initrd-switch-root.service": {
                "name": "initrd-switch-root.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "initrd-udevadm-cleanup-db.service": {
                "name": "initrd-udevadm-cleanup-db.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "iscsi-shutdown.service": {
                "name": "iscsi-shutdown.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "iscsi.service": {
                "name": "iscsi.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "iscsid.service": {
                "name": "iscsid.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "kmod-static-nodes.service": {
                "name": "kmod-static-nodes.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "ldconfig.service": {
                "name": "ldconfig.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "lvm2-activation-early.service": {
                "name": "lvm2-activation-early.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "lvm2-activation.service": {
                "name": "lvm2-activation.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "lvm2-lvmpolld.service": {
                "name": "lvm2-lvmpolld.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "lvm2-monitor.service": {
                "name": "lvm2-monitor.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "lvm2-pvscan@.service": {
                "name": "lvm2-pvscan@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "man-db-cache-update.service": {
                "name": "man-db-cache-update.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "man-db-restart-cache-update.service": {
                "name": "man-db-restart-cache-update.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "mdadm-grow-continue@.service": {
                "name": "mdadm-grow-continue@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "mdadm-last-resort@.service": {
                "name": "mdadm-last-resort@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "mdcheck_continue.service": {
                "name": "mdcheck_continue.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "mdcheck_start.service": {
                "name": "mdcheck_start.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "mdmon@.service": {
                "name": "mdmon@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "mdmonitor-oneshot.service": {
                "name": "mdmonitor-oneshot.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "mdmonitor.service": {
                "name": "mdmonitor.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "modprobe@.service": {
                "name": "modprobe@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "modprobe@configfs.service": {
                "name": "modprobe@configfs.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "modprobe@drm.service": {
                "name": "modprobe@drm.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "modprobe@fuse.service": {
                "name": "modprobe@fuse.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "multipathd.service": {
                "name": "multipathd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "ndctl-monitor.service": {
                "name": "ndctl-monitor.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "network.service": {
                "name": "network.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "nis-domainname.service": {
                "name": "nis-domainname.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "ntpd.service": {
                "name": "ntpd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "ntpdate.service": {
                "name": "ntpdate.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "pam_namespace.service": {
                "name": "pam_namespace.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "pcscd.service": {
                "name": "pcscd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "indirect"
            },
            "plymouth-quit-wait.service": {
                "name": "plymouth-quit-wait.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "plymouth-start.service": {
                "name": "plymouth-start.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "polkit.service": {
                "name": "polkit.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "qemu-guest-agent.service": {
                "name": "qemu-guest-agent.service",
                "source": "systemd",
                "state": "inactive",
                "status": "enabled"
            },
            "quotaon.service": {
                "name": "quotaon.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "raid-check.service": {
                "name": "raid-check.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "rbdmap.service": {
                "name": "rbdmap.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "rc-local.service": {
                "name": "rc-local.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "rdisc.service": {
                "name": "rdisc.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "rescue.service": {
                "name": "rescue.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "rpmdb-rebuild.service": {
                "name": "rpmdb-rebuild.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "selinux-autorelabel-mark.service": {
                "name": "selinux-autorelabel-mark.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "selinux-autorelabel.service": {
                "name": "selinux-autorelabel.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "selinux-check-proper-disable.service": {
                "name": "selinux-check-proper-disable.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "serial-getty@.service": {
                "name": "serial-getty@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "indirect"
            },
            "serial-getty@ttyS0.service": {
                "name": "serial-getty@ttyS0.service",
                "source": "systemd",
                "state": "running",
                "status": "active"
            },
            "snapd.seeded.service": {
                "name": "snapd.seeded.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "sntp.service": {
                "name": "sntp.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "sshd-keygen.service": {
                "name": "sshd-keygen.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "sshd-keygen@.service": {
                "name": "sshd-keygen@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "disabled"
            },
            "sshd-keygen@ecdsa.service": {
                "name": "sshd-keygen@ecdsa.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "sshd-keygen@ed25519.service": {
                "name": "sshd-keygen@ed25519.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "sshd-keygen@rsa.service": {
                "name": "sshd-keygen@rsa.service",
                "source": "systemd",
                "state": "stopped",
                "status": "inactive"
            },
            "sshd.service": {
                "name": "sshd.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "sshd@.service": {
                "name": "sshd@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "sssd-autofs.service": {
                "name": "sssd-autofs.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "sssd-kcm.service": {
                "name": "sssd-kcm.service",
                "source": "systemd",
                "state": "stopped",
                "status": "indirect"
            },
            "sssd-nss.service": {
                "name": "sssd-nss.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "sssd-pac.service": {
                "name": "sssd-pac.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "sssd-pam.service": {
                "name": "sssd-pam.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "sssd-ssh.service": {
                "name": "sssd-ssh.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "sssd-sudo.service": {
                "name": "sssd-sudo.service",
                "source": "systemd",
                "state": "inactive",
                "status": "indirect"
            },
            "sssd.service": {
                "name": "sssd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "syslog.service": {
                "name": "syslog.service",
                "source": "systemd",
                "state": "stopped",
                "status": "not-found"
            },
            "system-update-cleanup.service": {
                "name": "system-update-cleanup.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-ask-password-console.service": {
                "name": "systemd-ask-password-console.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-ask-password-wall.service": {
                "name": "systemd-ask-password-wall.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-backlight@.service": {
                "name": "systemd-backlight@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-binfmt.service": {
                "name": "systemd-binfmt.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-bless-boot.service": {
                "name": "systemd-bless-boot.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-boot-check-no-failures.service": {
                "name": "systemd-boot-check-no-failures.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-boot-system-token.service": {
                "name": "systemd-boot-system-token.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-coredump@.service": {
                "name": "systemd-coredump@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-exit.service": {
                "name": "systemd-exit.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-firstboot.service": {
                "name": "systemd-firstboot.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-fsck-root.service": {
                "name": "systemd-fsck-root.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-fsck@.service": {
                "name": "systemd-fsck@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-fsck@dev-disk-by\\x2duuid-5B84\\x2d6DD7.service": {
                "name": "systemd-fsck@dev-disk-by\\x2duuid-5B84\\x2d6DD7.service",
                "source": "systemd",
                "state": "stopped",
                "status": "active"
            },
            "systemd-fsck@dev-disk-by\\x2duuid-5f2f82d0\\x2dae0a\\x2d4574\\x2d8811\\x2d62a31a51a870.service": {
                "name": "systemd-fsck@dev-disk-by\\x2duuid-5f2f82d0\\x2dae0a\\x2d4574\\x2d8811\\x2d62a31a51a870.service",
                "source": "systemd",
                "state": "stopped",
                "status": "active"
            },
            "systemd-fsck@dev-vdb1.service": {
                "name": "systemd-fsck@dev-vdb1.service",
                "source": "systemd",
                "state": "stopped",
                "status": "active"
            },
            "systemd-fsck@dev-vdc1.service": {
                "name": "systemd-fsck@dev-vdc1.service",
                "source": "systemd",
                "state": "stopped",
                "status": "active"
            },
            "systemd-halt.service": {
                "name": "systemd-halt.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-hibernate-resume@.service": {
                "name": "systemd-hibernate-resume@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-hibernate.service": {
                "name": "systemd-hibernate.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-homed-activate.service": {
                "name": "systemd-homed-activate.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-homed.service": {
                "name": "systemd-homed.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-hostnamed.service": {
                "name": "systemd-hostnamed.service",
                "source": "systemd",
                "state": "running",
                "status": "static"
            },
            "systemd-hwdb-update.service": {
                "name": "systemd-hwdb-update.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-hybrid-sleep.service": {
                "name": "systemd-hybrid-sleep.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-initctl.service": {
                "name": "systemd-initctl.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-journal-catalog-update.service": {
                "name": "systemd-journal-catalog-update.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-journal-flush.service": {
                "name": "systemd-journal-flush.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-journald.service": {
                "name": "systemd-journald.service",
                "source": "systemd",
                "state": "running",
                "status": "static"
            },
            "systemd-journald@.service": {
                "name": "systemd-journald@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-kexec.service": {
                "name": "systemd-kexec.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-localed.service": {
                "name": "systemd-localed.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-logind.service": {
                "name": "systemd-logind.service",
                "source": "systemd",
                "state": "running",
                "status": "static"
            },
            "systemd-machine-id-commit.service": {
                "name": "systemd-machine-id-commit.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-modules-load.service": {
                "name": "systemd-modules-load.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-network-generator.service": {
                "name": "systemd-network-generator.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "systemd-networkd-wait-online.service": {
                "name": "systemd-networkd-wait-online.service",
                "source": "systemd",
                "state": "stopped",
                "status": "disabled"
            },
            "systemd-networkd.service": {
                "name": "systemd-networkd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "disabled"
            },
            "systemd-oomd.service": {
                "name": "systemd-oomd.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "systemd-portabled.service": {
                "name": "systemd-portabled.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-poweroff.service": {
                "name": "systemd-poweroff.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-pstore.service": {
                "name": "systemd-pstore.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-quotacheck.service": {
                "name": "systemd-quotacheck.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-random-seed.service": {
                "name": "systemd-random-seed.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-reboot.service": {
                "name": "systemd-reboot.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-remount-fs.service": {
                "name": "systemd-remount-fs.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled-runtime"
            },
            "systemd-repart.service": {
                "name": "systemd-repart.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-resolved.service": {
                "name": "systemd-resolved.service",
                "source": "systemd",
                "state": "running",
                "status": "enabled"
            },
            "systemd-rfkill.service": {
                "name": "systemd-rfkill.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-suspend-then-hibernate.service": {
                "name": "systemd-suspend-then-hibernate.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-suspend.service": {
                "name": "systemd-suspend.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-sysctl.service": {
                "name": "systemd-sysctl.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-sysext.service": {
                "name": "systemd-sysext.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-sysusers.service": {
                "name": "systemd-sysusers.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-time-wait-sync.service": {
                "name": "systemd-time-wait-sync.service",
                "source": "systemd",
                "state": "inactive",
                "status": "disabled"
            },
            "systemd-timedated.service": {
                "name": "systemd-timedated.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-timesyncd.service": {
                "name": "systemd-timesyncd.service",
                "source": "systemd",
                "state": "stopped",
                "status": "disabled"
            },
            "systemd-tmpfiles-clean.service": {
                "name": "systemd-tmpfiles-clean.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-tmpfiles-setup-dev.service": {
                "name": "systemd-tmpfiles-setup-dev.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-tmpfiles-setup.service": {
                "name": "systemd-tmpfiles-setup.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-udev-settle.service": {
                "name": "systemd-udev-settle.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-udev-trigger.service": {
                "name": "systemd-udev-trigger.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-udevd.service": {
                "name": "systemd-udevd.service",
                "source": "systemd",
                "state": "running",
                "status": "static"
            },
            "systemd-update-done.service": {
                "name": "systemd-update-done.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-update-utmp-runlevel.service": {
                "name": "systemd-update-utmp-runlevel.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-update-utmp.service": {
                "name": "systemd-update-utmp.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-user-sessions.service": {
                "name": "systemd-user-sessions.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-userdbd.service": {
                "name": "systemd-userdbd.service",
                "source": "systemd",
                "state": "running",
                "status": "indirect"
            },
            "systemd-vconsole-setup.service": {
                "name": "systemd-vconsole-setup.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "systemd-volatile-root.service": {
                "name": "systemd-volatile-root.service",
                "source": "systemd",
                "state": "inactive",
                "status": "static"
            },
            "systemd-zram-setup@.service": {
                "name": "systemd-zram-setup@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "systemd-zram-setup@zram0.service": {
                "name": "systemd-zram-setup@zram0.service",
                "source": "systemd",
                "state": "stopped",
                "status": "active"
            },
            "udisks2.service": {
                "name": "udisks2.service",
                "source": "systemd",
                "state": "stopped",
                "status": "enabled"
            },
            "unbound-anchor.service": {
                "name": "unbound-anchor.service",
                "source": "systemd",
                "state": "stopped",
                "status": "static"
            },
            "user-runtime-dir@.service": {
                "name": "user-runtime-dir@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "user-runtime-dir@0.service": {
                "name": "user-runtime-dir@0.service",
                "source": "systemd",
                "state": "stopped",
                "status": "active"
            },
            "user@.service": {
                "name": "user@.service",
                "source": "systemd",
                "state": "unknown",
                "status": "static"
            },
            "user@0.service": {
                "name": "user@0.service",
                "source": "systemd",
                "state": "running",
                "status": "active"
            }
        }
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] *****
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:53
Wednesday 06 July 2022  14:53:29 +0000 (0:00:01.891)       0:00:09.510 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_cryptsetup_services": []
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58
Wednesday 06 July 2022  14:53:29 +0000 (0:00:00.106)       0:00:09.616 ******** 

TASK [fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64
Wednesday 06 July 2022  14:53:29 +0000 (0:00:00.021)       0:00:09.638 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [],
    "pools": [],
    "volumes": []
}

TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:78
Wednesday 06 July 2022  14:53:30 +0000 (0:00:00.565)       0:00:10.203 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90
Wednesday 06 July 2022  14:53:30 +0000 (0:00:00.034)       0:00:10.238 ******** 

TASK [fedora.linux_system_roles.storage : show blivet_output] ******************
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:96
Wednesday 06 July 2022  14:53:30 +0000 (0:00:00.020)       0:00:10.259 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "blivet_output": {
        "actions": [],
        "changed": false,
        "crypts": [],
        "failed": false,
        "leaves": [],
        "mounts": [],
        "packages": [],
        "pools": [],
        "volumes": []
    }
}

TASK [fedora.linux_system_roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:101
Wednesday 06 July 2022  14:53:30 +0000 (0:00:00.036)       0:00:10.295 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:105
Wednesday 06 July 2022  14:53:30 +0000 (0:00:00.033)       0:00:10.329 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : remove obsolete mounts] **************
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121
Wednesday 06 July 2022  14:53:30 +0000 (0:00:00.035)       0:00:10.364 ******** 

TASK [fedora.linux_system_roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132
Wednesday 06 July 2022  14:53:30 +0000 (0:00:00.034)       0:00:10.398 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : set up new/current mounts] ***********
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:137
Wednesday 06 July 2022  14:53:30 +0000 (0:00:00.023)       0:00:10.422 ******** 

TASK [fedora.linux_system_roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148
Wednesday 06 July 2022  14:53:30 +0000 (0:00:00.032)       0:00:10.454 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:156
Wednesday 06 July 2022  14:53:30 +0000 (0:00:00.022)       0:00:10.477 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "stat": {
        "atime": 1657093385.4860332,
        "attr_flags": "",
        "attributes": [],
        "block_size": 4096,
        "blocks": 0,
        "charset": "binary",
        "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709",
        "ctime": 1657005647.423,
        "dev": 31,
        "device_type": 0,
        "executable": false,
        "exists": true,
        "gid": 0,
        "gr_name": "root",
        "inode": 267,
        "isblk": false,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": true,
        "issock": false,
        "isuid": false,
        "mimetype": "inode/x-empty",
        "mode": "0600",
        "mtime": 1657005500.596,
        "nlink": 1,
        "path": "/etc/crypttab",
        "pw_name": "root",
        "readable": true,
        "rgrp": false,
        "roth": false,
        "rusr": true,
        "size": 0,
        "uid": 0,
        "version": "10",
        "wgrp": false,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": false,
        "xoth": false,
        "xusr": false
    }
}

TASK [fedora.linux_system_roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161
Wednesday 06 July 2022  14:53:31 +0000 (0:00:00.518)       0:00:10.995 ******** 

TASK [fedora.linux_system_roles.storage : Update facts] ************************
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:183
Wednesday 06 July 2022  14:53:31 +0000 (0:00:00.024)       0:00:11.020 ******** 
ok: [/cache/fedora-35.qcow2.snap]
META: role_complete for /cache/fedora-35.qcow2.snap

TASK [Mark tasks to be skipped] ************************************************
task path: /tmp/tmpus9dv81c/tests/storage/tests_remove_mount.yml:15
Wednesday 06 July 2022  14:53:32 +0000 (0:00:00.957)       0:00:11.977 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_skip_checks": [
            "blivet_available",
            "packages_installed",
            "service_facts"
        ]
    },
    "changed": false
}

TASK [include_tasks] ***********************************************************
task path: /tmp/tmpus9dv81c/tests/storage/tests_remove_mount.yml:22
Wednesday 06 July 2022  14:53:32 +0000 (0:00:00.035)       0:00:12.012 ******** 
included: /tmp/tmpus9dv81c/tests/storage/get_unused_disk.yml for /cache/fedora-35.qcow2.snap

TASK [Find unused disks in the system] *****************************************
task path: /tmp/tmpus9dv81c/tests/storage/get_unused_disk.yml:2
Wednesday 06 July 2022  14:53:32 +0000 (0:00:00.035)       0:00:12.048 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "disks": [
        "sda"
    ]
}

TASK [Set unused_disks if necessary] *******************************************
task path: /tmp/tmpus9dv81c/tests/storage/get_unused_disk.yml:9
Wednesday 06 July 2022  14:53:32 +0000 (0:00:00.539)       0:00:12.588 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "unused_disks": [
            "sda"
        ]
    },
    "changed": false
}

TASK [Exit playbook when there's not enough unused disks in the system] ********
task path: /tmp/tmpus9dv81c/tests/storage/get_unused_disk.yml:14
Wednesday 06 July 2022  14:53:32 +0000 (0:00:00.037)       0:00:12.625 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Print unused disks] ******************************************************
task path: /tmp/tmpus9dv81c/tests/storage/get_unused_disk.yml:19
Wednesday 06 July 2022  14:53:32 +0000 (0:00:00.037)       0:00:12.662 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "unused_disks": [
        "sda"
    ]
}

TASK [Create a LVM logical volume mounted at "/opt/test1"] *********************
task path: /tmp/tmpus9dv81c/tests/storage/tests_remove_mount.yml:27
Wednesday 06 July 2022  14:53:32 +0000 (0:00:00.061)       0:00:12.724 ******** 

TASK [fedora.linux_system_roles.storage : set platform/version specific variables] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Wednesday 06 July 2022  14:53:33 +0000 (0:00:00.072)       0:00:12.796 ******** 
included: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for /cache/fedora-35.qcow2.snap

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Wednesday 06 July 2022  14:53:33 +0000 (0:00:00.035)       0:00:12.832 ******** 
ok: [/cache/fedora-35.qcow2.snap]

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Wednesday 06 July 2022  14:53:33 +0000 (0:00:00.532)       0:00:13.364 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => (item=RedHat.yml)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
ok: [/cache/fedora-35.qcow2.snap] => (item=Fedora.yml) => {
    "ansible_facts": {
        "_storage_copr_packages": [
            {
                "packages": [
                    "vdo",
                    "kmod-vdo"
                ],
                "repository": "rhawalsh/dm-vdo"
            }
        ],
        "_storage_copr_support_packages": [
            "dnf-plugins-core"
        ],
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/vars/Fedora.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "Fedora.yml"
}
skipping: [/cache/fedora-35.qcow2.snap] => (item=Fedora_35.yml)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "Fedora_35.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [/cache/fedora-35.qcow2.snap] => (item=Fedora_35.yml)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "Fedora_35.yml",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Wednesday 06 July 2022  14:53:33 +0000 (0:00:00.059)       0:00:13.424 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Wednesday 06 July 2022  14:53:33 +0000 (0:00:00.033)       0:00:13.457 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : include the appropriate provider tasks] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Wednesday 06 July 2022  14:53:33 +0000 (0:00:00.033)       0:00:13.491 ******** 
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
included: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for /cache/fedora-35.qcow2.snap

TASK [fedora.linux_system_roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
Wednesday 06 July 2022  14:53:33 +0000 (0:00:00.057)       0:00:13.549 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : make sure blivet is available] *******
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:7
Wednesday 06 July 2022  14:53:33 +0000 (0:00:00.022)       0:00:13.571 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : show storage_pools] ******************
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:13
Wednesday 06 July 2022  14:53:33 +0000 (0:00:00.029)       0:00:13.601 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "storage_pools": [
        {
            "disks": [
                "sda"
            ],
            "name": "foo",
            "volumes": [
                {
                    "mount_point": "/opt/test1",
                    "name": "test1",
                    "size": "3g"
                }
            ]
        }
    ]
}

TASK [fedora.linux_system_roles.storage : show storage_volumes] ****************
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:18
Wednesday 06 July 2022  14:53:33 +0000 (0:00:00.039)       0:00:13.640 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined"
}

TASK [fedora.linux_system_roles.storage : get required packages] ***************
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:23
Wednesday 06 July 2022  14:53:33 +0000 (0:00:00.040)       0:00:13.680 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : enable copr repositories if needed] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:35
Wednesday 06 July 2022  14:53:33 +0000 (0:00:00.030)       0:00:13.711 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : make sure required packages are installed] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:41
Wednesday 06 July 2022  14:53:33 +0000 (0:00:00.029)       0:00:13.740 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : get service facts] *******************
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:47
Wednesday 06 July 2022  14:53:33 +0000 (0:00:00.029)       0:00:13.769 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] *****
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:53
Wednesday 06 July 2022  14:53:34 +0000 (0:00:00.033)       0:00:13.802 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_cryptsetup_services": []
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58
Wednesday 06 July 2022  14:53:34 +0000 (0:00:00.058)       0:00:13.861 ******** 

TASK [fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64
Wednesday 06 July 2022  14:53:34 +0000 (0:00:00.022)       0:00:13.883 ******** 
changed: [/cache/fedora-35.qcow2.snap] => {
    "actions": [
        {
            "action": "create format",
            "device": "/dev/sda",
            "fs_type": "lvmpv"
        },
        {
            "action": "create device",
            "device": "/dev/foo",
            "fs_type": null
        },
        {
            "action": "create device",
            "device": "/dev/mapper/foo-test1",
            "fs_type": null
        },
        {
            "action": "create format",
            "device": "/dev/mapper/foo-test1",
            "fs_type": "xfs"
        }
    ],
    "changed": true,
    "crypts": [],
    "leaves": [
        "/dev/sr0",
        "/dev/vda1",
        "/dev/vda2",
        "/dev/vda3",
        "/dev/vda4",
        "/dev/vda5",
        "/dev/sdb",
        "/dev/sdc",
        "/dev/nvme0n1",
        "/dev/nvme1n1",
        "/dev/nvme2n1",
        "/dev/vdb1",
        "/dev/vdc1",
        "/dev/vdd",
        "/dev/vde",
        "/dev/vdf",
        "/dev/zram0",
        "/dev/mapper/foo-test1"
    ],
    "mounts": [
        {
            "dump": 0,
            "fstype": "xfs",
            "opts": "defaults",
            "passno": 0,
            "path": "/opt/test1",
            "src": "/dev/mapper/foo-test1",
            "state": "mounted"
        }
    ],
    "packages": [
        "xfsprogs",
        "e2fsprogs",
        "lvm2",
        "btrfs-progs",
        "dosfstools"
    ],
    "pools": [
        {
            "disks": [
                "sda"
            ],
            "encryption": false,
            "encryption_cipher": null,
            "encryption_key": null,
            "encryption_key_size": null,
            "encryption_luks_version": null,
            "encryption_password": null,
            "name": "foo",
            "raid_chunk_size": null,
            "raid_device_count": null,
            "raid_level": null,
            "raid_metadata_version": null,
            "raid_spare_count": null,
            "state": "present",
            "type": "lvm",
            "volumes": [
                {
                    "_device": "/dev/mapper/foo-test1",
                    "_kernel_device": "/dev/dm-0",
                    "_mount_id": "/dev/mapper/foo-test1",
                    "_raw_device": "/dev/mapper/foo-test1",
                    "_raw_kernel_device": "/dev/dm-0",
                    "cache_devices": [],
                    "cache_mode": null,
                    "cache_size": 0,
                    "cached": false,
                    "compression": null,
                    "deduplication": null,
                    "disks": [],
                    "encryption": false,
                    "encryption_cipher": null,
                    "encryption_key": null,
                    "encryption_key_size": null,
                    "encryption_luks_version": null,
                    "encryption_password": null,
                    "fs_create_options": "",
                    "fs_label": "",
                    "fs_overwrite_existing": true,
                    "fs_type": "xfs",
                    "mount_check": 0,
                    "mount_device_identifier": "uuid",
                    "mount_options": "defaults",
                    "mount_passno": 0,
                    "mount_point": "/opt/test1",
                    "name": "test1",
                    "raid_chunk_size": null,
                    "raid_device_count": null,
                    "raid_disks": [],
                    "raid_level": null,
                    "raid_metadata_version": null,
                    "raid_spare_count": null,
                    "size": "3g",
                    "state": "present",
                    "thin": false,
                    "thin_pool_name": null,
                    "thin_pool_size": null,
                    "type": "lvm",
                    "vdo_pool_size": null
                }
            ]
        }
    ],
    "volumes": []
}

TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:78
Wednesday 06 July 2022  14:53:36 +0000 (0:00:02.581)       0:00:16.465 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90
Wednesday 06 July 2022  14:53:36 +0000 (0:00:00.038)       0:00:16.503 ******** 

TASK [fedora.linux_system_roles.storage : show blivet_output] ******************
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:96
Wednesday 06 July 2022  14:53:36 +0000 (0:00:00.022)       0:00:16.526 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "blivet_output": {
        "actions": [
            {
                "action": "create format",
                "device": "/dev/sda",
                "fs_type": "lvmpv"
            },
            {
                "action": "create device",
                "device": "/dev/foo",
                "fs_type": null
            },
            {
                "action": "create device",
                "device": "/dev/mapper/foo-test1",
                "fs_type": null
            },
            {
                "action": "create format",
                "device": "/dev/mapper/foo-test1",
                "fs_type": "xfs"
            }
        ],
        "changed": true,
        "crypts": [],
        "failed": false,
        "leaves": [
            "/dev/sr0",
            "/dev/vda1",
            "/dev/vda2",
            "/dev/vda3",
            "/dev/vda4",
            "/dev/vda5",
            "/dev/sdb",
            "/dev/sdc",
            "/dev/nvme0n1",
            "/dev/nvme1n1",
            "/dev/nvme2n1",
            "/dev/vdb1",
            "/dev/vdc1",
            "/dev/vdd",
            "/dev/vde",
            "/dev/vdf",
            "/dev/zram0",
            "/dev/mapper/foo-test1"
        ],
        "mounts": [
            {
                "dump": 0,
                "fstype": "xfs",
                "opts": "defaults",
                "passno": 0,
                "path": "/opt/test1",
                "src": "/dev/mapper/foo-test1",
                "state": "mounted"
            }
        ],
        "packages": [
            "xfsprogs",
            "e2fsprogs",
            "lvm2",
            "btrfs-progs",
            "dosfstools"
        ],
        "pools": [
            {
                "disks": [
                    "sda"
                ],
                "encryption": false,
                "encryption_cipher": null,
                "encryption_key": null,
                "encryption_key_size": null,
                "encryption_luks_version": null,
                "encryption_password": null,
                "name": "foo",
                "raid_chunk_size": null,
                "raid_device_count": null,
                "raid_level": null,
                "raid_metadata_version": null,
                "raid_spare_count": null,
                "state": "present",
                "type": "lvm",
                "volumes": [
                    {
                        "_device": "/dev/mapper/foo-test1",
                        "_kernel_device": "/dev/dm-0",
                        "_mount_id": "/dev/mapper/foo-test1",
                        "_raw_device": "/dev/mapper/foo-test1",
                        "_raw_kernel_device": "/dev/dm-0",
                        "cache_devices": [],
                        "cache_mode": null,
                        "cache_size": 0,
                        "cached": false,
                        "compression": null,
                        "deduplication": null,
                        "disks": [],
                        "encryption": false,
                        "encryption_cipher": null,
                        "encryption_key": null,
                        "encryption_key_size": null,
                        "encryption_luks_version": null,
                        "encryption_password": null,
                        "fs_create_options": "",
                        "fs_label": "",
                        "fs_overwrite_existing": true,
                        "fs_type": "xfs",
                        "mount_check": 0,
                        "mount_device_identifier": "uuid",
                        "mount_options": "defaults",
                        "mount_passno": 0,
                        "mount_point": "/opt/test1",
                        "name": "test1",
                        "raid_chunk_size": null,
                        "raid_device_count": null,
                        "raid_disks": [],
                        "raid_level": null,
                        "raid_metadata_version": null,
                        "raid_spare_count": null,
                        "size": "3g",
                        "state": "present",
                        "thin": false,
                        "thin_pool_name": null,
                        "thin_pool_size": null,
                        "type": "lvm",
                        "vdo_pool_size": null
                    }
                ]
            }
        ],
        "volumes": []
    }
}

TASK [fedora.linux_system_roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:101
Wednesday 06 July 2022  14:53:36 +0000 (0:00:00.041)       0:00:16.568 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_pools_list": [
            {
                "disks": [
                    "sda"
                ],
                "encryption": false,
                "encryption_cipher": null,
                "encryption_key": null,
                "encryption_key_size": null,
                "encryption_luks_version": null,
                "encryption_password": null,
                "name": "foo",
                "raid_chunk_size": null,
                "raid_device_count": null,
                "raid_level": null,
                "raid_metadata_version": null,
                "raid_spare_count": null,
                "state": "present",
                "type": "lvm",
                "volumes": [
                    {
                        "_device": "/dev/mapper/foo-test1",
                        "_kernel_device": "/dev/dm-0",
                        "_mount_id": "/dev/mapper/foo-test1",
                        "_raw_device": "/dev/mapper/foo-test1",
                        "_raw_kernel_device": "/dev/dm-0",
                        "cache_devices": [],
                        "cache_mode": null,
                        "cache_size": 0,
                        "cached": false,
                        "compression": null,
                        "deduplication": null,
                        "disks": [],
                        "encryption": false,
                        "encryption_cipher": null,
                        "encryption_key": null,
                        "encryption_key_size": null,
                        "encryption_luks_version": null,
                        "encryption_password": null,
                        "fs_create_options": "",
                        "fs_label": "",
                        "fs_overwrite_existing": true,
                        "fs_type": "xfs",
                        "mount_check": 0,
                        "mount_device_identifier": "uuid",
                        "mount_options": "defaults",
                        "mount_passno": 0,
                        "mount_point": "/opt/test1",
                        "name": "test1",
                        "raid_chunk_size": null,
                        "raid_device_count": null,
                        "raid_disks": [],
                        "raid_level": null,
                        "raid_metadata_version": null,
                        "raid_spare_count": null,
                        "size": "3g",
                        "state": "present",
                        "thin": false,
                        "thin_pool_name": null,
                        "thin_pool_size": null,
                        "type": "lvm",
                        "vdo_pool_size": null
                    }
                ]
            }
        ]
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:105
Wednesday 06 July 2022  14:53:36 +0000 (0:00:00.037)       0:00:16.605 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : remove obsolete mounts] **************
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121
Wednesday 06 July 2022  14:53:36 +0000 (0:00:00.034)       0:00:16.640 ******** 

TASK [fedora.linux_system_roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132
Wednesday 06 July 2022  14:53:36 +0000 (0:00:00.034)       0:00:16.675 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "name": null,
    "status": {}
}

TASK [fedora.linux_system_roles.storage : set up new/current mounts] ***********
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:137
Wednesday 06 July 2022  14:53:37 +0000 (0:00:00.994)       0:00:17.669 ******** 
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
changed: [/cache/fedora-35.qcow2.snap] => (item={'src': '/dev/mapper/foo-test1', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted'}) => {
    "ansible_loop_var": "mount_info",
    "backup_file": "",
    "boot": "yes",
    "changed": true,
    "dump": "0",
    "fstab": "/etc/fstab",
    "fstype": "xfs",
    "mount_info": {
        "dump": 0,
        "fstype": "xfs",
        "opts": "defaults",
        "passno": 0,
        "path": "/opt/test1",
        "src": "/dev/mapper/foo-test1",
        "state": "mounted"
    },
    "name": "/opt/test1",
    "opts": "defaults",
    "passno": "0",
    "src": "/dev/mapper/foo-test1"
}

TASK [fedora.linux_system_roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148
Wednesday 06 July 2022  14:53:38 +0000 (0:00:00.729)       0:00:18.398 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "name": null,
    "status": {}
}

TASK [fedora.linux_system_roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:156
Wednesday 06 July 2022  14:53:39 +0000 (0:00:00.789)       0:00:19.188 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "stat": {
        "atime": 1657093385.4860332,
        "attr_flags": "",
        "attributes": [],
        "block_size": 4096,
        "blocks": 0,
        "charset": "binary",
        "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709",
        "ctime": 1657005647.423,
        "dev": 31,
        "device_type": 0,
        "executable": false,
        "exists": true,
        "gid": 0,
        "gr_name": "root",
        "inode": 267,
        "isblk": false,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": true,
        "issock": false,
        "isuid": false,
        "mimetype": "inode/x-empty",
        "mode": "0600",
        "mtime": 1657005500.596,
        "nlink": 1,
        "path": "/etc/crypttab",
        "pw_name": "root",
        "readable": true,
        "rgrp": false,
        "roth": false,
        "rusr": true,
        "size": 0,
        "uid": 0,
        "version": "10",
        "wgrp": false,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": false,
        "xoth": false,
        "xusr": false
    }
}

TASK [fedora.linux_system_roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161
Wednesday 06 July 2022  14:53:39 +0000 (0:00:00.414)       0:00:19.602 ******** 

TASK [fedora.linux_system_roles.storage : Update facts] ************************
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:183
Wednesday 06 July 2022  14:53:39 +0000 (0:00:00.025)       0:00:19.627 ******** 
ok: [/cache/fedora-35.qcow2.snap]
META: role_complete for /cache/fedora-35.qcow2.snap

TASK [include_tasks] ***********************************************************
task path: /tmp/tmpus9dv81c/tests/storage/tests_remove_mount.yml:39
Wednesday 06 July 2022  14:53:40 +0000 (0:00:00.921)       0:00:20.549 ******** 
included: /tmp/tmpus9dv81c/tests/storage/verify-role-results.yml for /cache/fedora-35.qcow2.snap

TASK [Print out pool information] **********************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-role-results.yml:1
Wednesday 06 July 2022  14:53:40 +0000 (0:00:00.040)       0:00:20.590 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "_storage_pools_list": [
        {
            "disks": [
                "sda"
            ],
            "encryption": false,
            "encryption_cipher": null,
            "encryption_key": null,
            "encryption_key_size": null,
            "encryption_luks_version": null,
            "encryption_password": null,
            "name": "foo",
            "raid_chunk_size": null,
            "raid_device_count": null,
            "raid_level": null,
            "raid_metadata_version": null,
            "raid_spare_count": null,
            "state": "present",
            "type": "lvm",
            "volumes": [
                {
                    "_device": "/dev/mapper/foo-test1",
                    "_kernel_device": "/dev/dm-0",
                    "_mount_id": "/dev/mapper/foo-test1",
                    "_raw_device": "/dev/mapper/foo-test1",
                    "_raw_kernel_device": "/dev/dm-0",
                    "cache_devices": [],
                    "cache_mode": null,
                    "cache_size": 0,
                    "cached": false,
                    "compression": null,
                    "deduplication": null,
                    "disks": [],
                    "encryption": false,
                    "encryption_cipher": null,
                    "encryption_key": null,
                    "encryption_key_size": null,
                    "encryption_luks_version": null,
                    "encryption_password": null,
                    "fs_create_options": "",
                    "fs_label": "",
                    "fs_overwrite_existing": true,
                    "fs_type": "xfs",
                    "mount_check": 0,
                    "mount_device_identifier": "uuid",
                    "mount_options": "defaults",
                    "mount_passno": 0,
                    "mount_point": "/opt/test1",
                    "name": "test1",
                    "raid_chunk_size": null,
                    "raid_device_count": null,
                    "raid_disks": [],
                    "raid_level": null,
                    "raid_metadata_version": null,
                    "raid_spare_count": null,
                    "size": "3g",
                    "state": "present",
                    "thin": false,
                    "thin_pool_name": null,
                    "thin_pool_size": null,
                    "type": "lvm",
                    "vdo_pool_size": null
                }
            ]
        }
    ]
}

TASK [Print out volume information] ********************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-role-results.yml:6
Wednesday 06 July 2022  14:53:40 +0000 (0:00:00.046)       0:00:20.637 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {}

TASK [Collect info about the volumes.] *****************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-role-results.yml:14
Wednesday 06 July 2022  14:53:40 +0000 (0:00:00.034)       0:00:20.671 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "info": {
        "/dev/mapper/foo-test1": {
            "fstype": "xfs",
            "label": "",
            "name": "/dev/mapper/foo-test1",
            "size": "3G",
            "type": "lvm",
            "uuid": "1c41a782-c302-4977-99a1-bf5ce9244c3a"
        },
        "/dev/nvme0n1": {
            "fstype": "",
            "label": "",
            "name": "/dev/nvme0n1",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/nvme1n1": {
            "fstype": "",
            "label": "",
            "name": "/dev/nvme1n1",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/nvme2n1": {
            "fstype": "",
            "label": "",
            "name": "/dev/nvme2n1",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/sda": {
            "fstype": "LVM2_member",
            "label": "",
            "name": "/dev/sda",
            "size": "10G",
            "type": "disk",
            "uuid": "t9n4nJ-EghS-hyi4-sfCo-76en-eXSi-MMptMD"
        },
        "/dev/sdb": {
            "fstype": "",
            "label": "",
            "name": "/dev/sdb",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/sdc": {
            "fstype": "",
            "label": "",
            "name": "/dev/sdc",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/sr0": {
            "fstype": "iso9660",
            "label": "cidata",
            "name": "/dev/sr0",
            "size": "364K",
            "type": "rom",
            "uuid": "2022-07-06-14-53-07-00"
        },
        "/dev/vda": {
            "fstype": "",
            "label": "",
            "name": "/dev/vda",
            "size": "4G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vda1": {
            "fstype": "",
            "label": "",
            "name": "/dev/vda1",
            "size": "1M",
            "type": "partition",
            "uuid": ""
        },
        "/dev/vda2": {
            "fstype": "ext4",
            "label": "boot",
            "name": "/dev/vda2",
            "size": "500M",
            "type": "partition",
            "uuid": "5f2f82d0-ae0a-4574-8811-62a31a51a870"
        },
        "/dev/vda3": {
            "fstype": "vfat",
            "label": "",
            "name": "/dev/vda3",
            "size": "100M",
            "type": "partition",
            "uuid": "5B84-6DD7"
        },
        "/dev/vda4": {
            "fstype": "",
            "label": "",
            "name": "/dev/vda4",
            "size": "4M",
            "type": "partition",
            "uuid": ""
        },
        "/dev/vda5": {
            "fstype": "btrfs",
            "label": "fedora",
            "name": "/dev/vda5",
            "size": "3.4G",
            "type": "partition",
            "uuid": "fbdaf05f-1a41-4dc5-b56e-a10edb430f9a"
        },
        "/dev/vdb": {
            "fstype": "",
            "label": "",
            "name": "/dev/vdb",
            "size": "2G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vdb1": {
            "fstype": "ext4",
            "label": "yumcache",
            "name": "/dev/vdb1",
            "size": "2G",
            "type": "partition",
            "uuid": "e676dfc5-3e4b-4331-8ede-73c3f56d2cab"
        },
        "/dev/vdc": {
            "fstype": "",
            "label": "",
            "name": "/dev/vdc",
            "size": "2G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vdc1": {
            "fstype": "ext4",
            "label": "yumvarlib",
            "name": "/dev/vdc1",
            "size": "2G",
            "type": "partition",
            "uuid": "0c299eb4-81f5-4414-b246-b95738eb82f0"
        },
        "/dev/vdd": {
            "fstype": "",
            "label": "",
            "name": "/dev/vdd",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vde": {
            "fstype": "",
            "label": "",
            "name": "/dev/vde",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vdf": {
            "fstype": "",
            "label": "",
            "name": "/dev/vdf",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/zram0": {
            "fstype": "",
            "label": "",
            "name": "/dev/zram0",
            "size": "1.9G",
            "type": "disk",
            "uuid": ""
        }
    }
}

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-role-results.yml:19
Wednesday 06 July 2022  14:53:41 +0000 (0:00:00.562)       0:00:21.233 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "cmd": [
        "cat",
        "/etc/fstab"
    ],
    "delta": "0:00:00.002846",
    "end": "2022-07-06 14:53:40.661300",
    "rc": 0,
    "start": "2022-07-06 14:53:40.658454"
}

STDOUT:


#
# /etc/fstab
# Created by anaconda on Tue Jul  5 07:18:20 2022
#
# Accessible filesystems, by reference, are maintained under '/dev/disk/'.
# See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info.
#
# After editing this file, run 'systemctl daemon-reload' to update systemd
# units generated from this file.
#
UUID=fbdaf05f-1a41-4dc5-b56e-a10edb430f9a /                       btrfs   subvol=root,compress=zstd:1 0 0
UUID=5f2f82d0-ae0a-4574-8811-62a31a51a870 /boot                   ext4    defaults        1 2
UUID=5B84-6DD7          /boot/efi               vfat    defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
UUID=fbdaf05f-1a41-4dc5-b56e-a10edb430f9a /home                   btrfs   subvol=home,compress=zstd:1 0 0
/dev/vdb1	/var/cache/dnf	auto	defaults,nofail,comment=cloudconfig	0	2
/dev/vdc1	/var/lib/dnf	auto	defaults,nofail,comment=cloudconfig	0	2
/dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-role-results.yml:24
Wednesday 06 July 2022  14:53:41 +0000 (0:00:00.503)       0:00:21.737 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "cmd": [
        "cat",
        "/etc/crypttab"
    ],
    "delta": "0:00:00.003057",
    "end": "2022-07-06 14:53:41.069843",
    "failed_when_result": false,
    "rc": 0,
    "start": "2022-07-06 14:53:41.066786"
}

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmpus9dv81c/tests/storage/verify-role-results.yml:33
Wednesday 06 July 2022  14:53:42 +0000 (0:00:00.408)       0:00:22.146 ******** 
included: /tmp/tmpus9dv81c/tests/storage/test-verify-pool.yml for /cache/fedora-35.qcow2.snap => (item={'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'name': 'test1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/dm-0'}], 'raid_chunk_size': None})

TASK [Set _storage_pool_tests] *************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool.yml:5
Wednesday 06 July 2022  14:53:42 +0000 (0:00:00.063)       0:00:22.209 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_pool_tests": [
            "members",
            "volumes"
        ]
    },
    "changed": false
}

TASK [include_tasks] ***********************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool.yml:18
Wednesday 06 July 2022  14:53:42 +0000 (0:00:00.070)       0:00:22.280 ******** 
included: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-members.yml for /cache/fedora-35.qcow2.snap => (item=members)
included: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-volumes.yml for /cache/fedora-35.qcow2.snap => (item=volumes)

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-members.yml:1
Wednesday 06 July 2022  14:53:42 +0000 (0:00:00.082)       0:00:22.362 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_expected_pv_count": "1",
        "_storage_test_pool_pvs_lvm": [
            "/dev/sda"
        ]
    },
    "changed": false
}

TASK [Get the canonical device path for each member device] ********************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-members.yml:6
Wednesday 06 July 2022  14:53:42 +0000 (0:00:00.064)       0:00:22.427 ******** 
ok: [/cache/fedora-35.qcow2.snap] => (item=/dev/sda) => {
    "ansible_loop_var": "pv",
    "changed": false,
    "device": "/dev/sda",
    "pv": "/dev/sda"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-members.yml:15
Wednesday 06 July 2022  14:53:43 +0000 (0:00:00.489)       0:00:22.916 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "__pvs_lvm_len": "1"
    },
    "changed": false
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-members.yml:19
Wednesday 06 July 2022  14:53:43 +0000 (0:00:00.048)       0:00:22.965 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_pool_pvs": [
            "/dev/sda"
        ]
    },
    "changed": false
}

TASK [Verify PV count] *********************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-members.yml:23
Wednesday 06 July 2022  14:53:43 +0000 (0:00:00.049)       0:00:23.015 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-members.yml:29
Wednesday 06 July 2022  14:53:43 +0000 (0:00:00.058)       0:00:23.074 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_expected_pv_type": "disk"
    },
    "changed": false
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-members.yml:33
Wednesday 06 July 2022  14:53:43 +0000 (0:00:00.039)       0:00:23.113 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_expected_pv_type": "disk"
    },
    "changed": false
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-members.yml:37
Wednesday 06 July 2022  14:53:43 +0000 (0:00:00.056)       0:00:23.169 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check the type of each PV] ***********************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-members.yml:41
Wednesday 06 July 2022  14:53:43 +0000 (0:00:00.026)       0:00:23.195 ******** 
ok: [/cache/fedora-35.qcow2.snap] => (item=/dev/sda) => {
    "ansible_loop_var": "pv",
    "changed": false,
    "pv": "/dev/sda"
}

MSG:

All assertions passed

TASK [Check MD RAID] ***********************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-members.yml:50
Wednesday 06 July 2022  14:53:43 +0000 (0:00:00.047)       0:00:23.243 ******** 
included: /tmp/tmpus9dv81c/tests/storage/verify-pool-md.yml for /cache/fedora-35.qcow2.snap

TASK [get information about RAID] **********************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-md.yml:6
Wednesday 06 July 2022  14:53:43 +0000 (0:00:00.045)       0:00:23.289 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-md.yml:12
Wednesday 06 July 2022  14:53:43 +0000 (0:00:00.025)       0:00:23.314 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-md.yml:16
Wednesday 06 July 2022  14:53:43 +0000 (0:00:00.026)       0:00:23.341 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-md.yml:20
Wednesday 06 July 2022  14:53:43 +0000 (0:00:00.025)       0:00:23.366 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-md.yml:24
Wednesday 06 July 2022  14:53:43 +0000 (0:00:00.023)       0:00:23.390 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-md.yml:30
Wednesday 06 July 2022  14:53:43 +0000 (0:00:00.022)       0:00:23.413 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-md.yml:36
Wednesday 06 July 2022  14:53:43 +0000 (0:00:00.022)       0:00:23.435 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-md.yml:44
Wednesday 06 July 2022  14:53:43 +0000 (0:00:00.023)       0:00:23.459 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_md_active_devices_re": null,
        "storage_test_md_metadata_version_re": null,
        "storage_test_md_spare_devices_re": null
    },
    "changed": false
}

TASK [Check LVM RAID] **********************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-members.yml:53
Wednesday 06 July 2022  14:53:43 +0000 (0:00:00.032)       0:00:23.491 ******** 
included: /tmp/tmpus9dv81c/tests/storage/verify-pool-members-lvmraid.yml for /cache/fedora-35.qcow2.snap

TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-members-lvmraid.yml:1
Wednesday 06 July 2022  14:53:43 +0000 (0:00:00.042)       0:00:23.534 ******** 
included: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-lvmraid.yml for /cache/fedora-35.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'name': 'test1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/dm-0'})

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-lvmraid.yml:3
Wednesday 06 July 2022  14:53:43 +0000 (0:00:00.042)       0:00:23.577 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-lvmraid.yml:8
Wednesday 06 July 2022  14:53:43 +0000 (0:00:00.027)       0:00:23.605 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-lvmraid.yml:12
Wednesday 06 July 2022  14:53:43 +0000 (0:00:00.029)       0:00:23.634 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check Thin Pools] ********************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-members.yml:56
Wednesday 06 July 2022  14:53:43 +0000 (0:00:00.031)       0:00:23.666 ******** 
included: /tmp/tmpus9dv81c/tests/storage/verify-pool-members-thin.yml for /cache/fedora-35.qcow2.snap

TASK [Validate pool member thinpool settings] **********************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-members-thin.yml:1
Wednesday 06 July 2022  14:53:43 +0000 (0:00:00.044)       0:00:23.710 ******** 
included: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-thin.yml for /cache/fedora-35.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'name': 'test1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/dm-0'})

TASK [Get information about thinpool] ******************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-thin.yml:3
Wednesday 06 July 2022  14:53:43 +0000 (0:00:00.043)       0:00:23.754 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check that volume is in correct thinpool (when thinp name is provided)] ***
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-thin.yml:8
Wednesday 06 July 2022  14:53:43 +0000 (0:00:00.023)       0:00:23.777 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check that volume is in thinpool (when thinp name is not provided)] ******
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-thin.yml:13
Wednesday 06 July 2022  14:53:44 +0000 (0:00:00.024)       0:00:23.802 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-thin.yml:17
Wednesday 06 July 2022  14:53:44 +0000 (0:00:00.022)       0:00:23.824 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check member encryption] *************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-members.yml:59
Wednesday 06 July 2022  14:53:44 +0000 (0:00:00.055)       0:00:23.880 ******** 
included: /tmp/tmpus9dv81c/tests/storage/verify-pool-members-encryption.yml for /cache/fedora-35.qcow2.snap

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-members-encryption.yml:4
Wednesday 06 July 2022  14:53:44 +0000 (0:00:00.048)       0:00:23.928 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_expected_crypttab_entries": "0",
        "_storage_test_expected_crypttab_key_file": "-"
    },
    "changed": false
}

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-members-encryption.yml:8
Wednesday 06 July 2022  14:53:44 +0000 (0:00:00.052)       0:00:23.980 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => (item=/dev/sda)  => {
    "_storage_test_pool_member_path": "/dev/sda",
    "ansible_loop_var": "_storage_test_pool_member_path",
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-members-encryption.yml:15
Wednesday 06 July 2022  14:53:44 +0000 (0:00:00.026)       0:00:24.006 ******** 
included: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-crypttab.yml for /cache/fedora-35.qcow2.snap => (item=/dev/sda)

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-crypttab.yml:1
Wednesday 06 July 2022  14:53:44 +0000 (0:00:00.041)       0:00:24.048 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": []
    },
    "changed": false
}

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-crypttab.yml:4
Wednesday 06 July 2022  14:53:44 +0000 (0:00:00.051)       0:00:24.099 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-crypttab.yml:9
Wednesday 06 July 2022  14:53:44 +0000 (0:00:00.047)       0:00:24.147 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-crypttab.yml:15
Wednesday 06 July 2022  14:53:44 +0000 (0:00:00.035)       0:00:24.183 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-crypttab.yml:21
Wednesday 06 July 2022  14:53:44 +0000 (0:00:00.035)       0:00:24.218 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-crypttab.yml:27
Wednesday 06 July 2022  14:53:44 +0000 (0:00:00.040)       0:00:24.258 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": null
    },
    "changed": false
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-members-encryption.yml:22
Wednesday 06 July 2022  14:53:44 +0000 (0:00:00.036)       0:00:24.295 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": null,
        "_storage_test_crypttab_key_file": null
    },
    "changed": false
}

TASK [Check VDO] ***************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-members.yml:62
Wednesday 06 July 2022  14:53:44 +0000 (0:00:00.042)       0:00:24.337 ******** 
included: /tmp/tmpus9dv81c/tests/storage/verify-pool-members-vdo.yml for /cache/fedora-35.qcow2.snap

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-members-vdo.yml:1
Wednesday 06 July 2022  14:53:44 +0000 (0:00:00.047)       0:00:24.385 ******** 
included: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-vdo.yml for /cache/fedora-35.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'name': 'test1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/dm-0'})

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-vdo.yml:3
Wednesday 06 July 2022  14:53:44 +0000 (0:00:00.043)       0:00:24.428 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-vdo.yml:8
Wednesday 06 July 2022  14:53:44 +0000 (0:00:00.025)       0:00:24.453 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [assert] ******************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-vdo.yml:11
Wednesday 06 July 2022  14:53:44 +0000 (0:00:00.023)       0:00:24.477 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [assert] ******************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-vdo.yml:16
Wednesday 06 July 2022  14:53:44 +0000 (0:00:00.023)       0:00:24.501 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-vdo.yml:21
Wednesday 06 July 2022  14:53:44 +0000 (0:00:00.023)       0:00:24.524 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [assert] ******************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-vdo.yml:24
Wednesday 06 July 2022  14:53:44 +0000 (0:00:00.026)       0:00:24.551 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [assert] ******************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-vdo.yml:29
Wednesday 06 July 2022  14:53:44 +0000 (0:00:00.026)       0:00:24.577 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-vdo.yml:39
Wednesday 06 July 2022  14:53:44 +0000 (0:00:00.024)       0:00:24.602 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_vdo_status": null
    },
    "changed": false
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-members.yml:65
Wednesday 06 July 2022  14:53:44 +0000 (0:00:00.036)       0:00:24.638 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "__pvs_lvm_len": null,
        "_storage_test_expected_pv_count": null,
        "_storage_test_expected_pv_type": null,
        "_storage_test_pool_pvs": [],
        "_storage_test_pool_pvs_lvm": []
    },
    "changed": false
}

TASK [verify the volumes] ******************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-volumes.yml:3
Wednesday 06 July 2022  14:53:44 +0000 (0:00:00.034)       0:00:24.673 ******** 
included: /tmp/tmpus9dv81c/tests/storage/test-verify-volume.yml for /cache/fedora-35.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'name': 'test1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/dm-0'})

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume.yml:2
Wednesday 06 July 2022  14:53:44 +0000 (0:00:00.040)       0:00:24.714 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_volume_present": true,
        "_storage_volume_tests": [
            "mount",
            "fstab",
            "fs",
            "device",
            "encryption",
            "md",
            "size",
            "cache"
        ]
    },
    "changed": false
}

TASK [include_tasks] ***********************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume.yml:10
Wednesday 06 July 2022  14:53:44 +0000 (0:00:00.045)       0:00:24.759 ******** 
included: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-mount.yml for /cache/fedora-35.qcow2.snap => (item=mount)
included: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-fstab.yml for /cache/fedora-35.qcow2.snap => (item=fstab)
included: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-fs.yml for /cache/fedora-35.qcow2.snap => (item=fs)
included: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-device.yml for /cache/fedora-35.qcow2.snap => (item=device)
included: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml for /cache/fedora-35.qcow2.snap => (item=encryption)
included: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-md.yml for /cache/fedora-35.qcow2.snap => (item=md)
included: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-size.yml for /cache/fedora-35.qcow2.snap => (item=size)
included: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-cache.yml for /cache/fedora-35.qcow2.snap => (item=cache)

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-mount.yml:6
Wednesday 06 July 2022  14:53:45 +0000 (0:00:00.084)       0:00:24.844 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_device_path": "/dev/mapper/foo-test1"
    },
    "changed": false
}

TASK [Set some facts] **********************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-mount.yml:10
Wednesday 06 July 2022  14:53:45 +0000 (0:00:00.041)       0:00:24.886 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_mount_device_matches": [
            {
                "block_available": 770146,
                "block_size": 4096,
                "block_total": 783872,
                "block_used": 13726,
                "device": "/dev/mapper/foo-test1",
                "fstype": "xfs",
                "inode_available": 1572861,
                "inode_total": 1572864,
                "inode_used": 3,
                "mount": "/opt/test1",
                "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota",
                "size_available": 3154518016,
                "size_total": 3210739712,
                "uuid": "1c41a782-c302-4977-99a1-bf5ce9244c3a"
            }
        ],
        "storage_test_mount_expected_match_count": "1",
        "storage_test_mount_point_matches": [
            {
                "block_available": 770146,
                "block_size": 4096,
                "block_total": 783872,
                "block_used": 13726,
                "device": "/dev/mapper/foo-test1",
                "fstype": "xfs",
                "inode_available": 1572861,
                "inode_total": 1572864,
                "inode_used": 3,
                "mount": "/opt/test1",
                "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota",
                "size_available": 3154518016,
                "size_total": 3210739712,
                "uuid": "1c41a782-c302-4977-99a1-bf5ce9244c3a"
            }
        ],
        "storage_test_swap_expected_matches": "0"
    },
    "changed": false
}
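
(Aside: the two match lists above are plainly the gathered mount facts filtered once by device path and once by mount point. A minimal sketch of how such lists can be built -- this is an assumption about test-verify-volume-mount.yml, not the verbatim task; storage_test_device_path comes from the previous task and storage_test_volume is assumed to hold the volume dict:)

    # Hypothetical sketch -- not the verbatim task from test-verify-volume-mount.yml.
    # Filters the gathered ansible_mounts facts the way the logged results suggest:
    # one list keyed on the device path, one keyed on the mount point.
    - name: Set some facts (sketch)
      set_fact:
        storage_test_mount_device_matches: "{{ ansible_mounts | selectattr('device', 'equalto', storage_test_device_path) | list }}"
        storage_test_mount_point_matches: "{{ ansible_mounts | selectattr('mount', 'equalto', storage_test_volume.mount_point) | list }}"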

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-mount.yml:20
Wednesday 06 July 2022  14:53:45 +0000 (0:00:00.058)       0:00:24.944 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-mount.yml:29
Wednesday 06 July 2022  14:53:45 +0000 (0:00:00.054)       0:00:24.999 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-mount.yml:37
Wednesday 06 July 2022  14:53:45 +0000 (0:00:00.080)       0:00:25.080 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [command] *****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-mount.yml:46
Wednesday 06 July 2022  14:53:45 +0000 (0:00:00.116)       0:00:25.196 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Gather swap info] ********************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-mount.yml:50
Wednesday 06 July 2022  14:53:45 +0000 (0:00:00.024)       0:00:25.221 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify swap status] ******************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-mount.yml:55
Wednesday 06 July 2022  14:53:45 +0000 (0:00:00.023)       0:00:25.245 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Unset facts] *************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-mount.yml:65
Wednesday 06 July 2022  14:53:45 +0000 (0:00:00.024)       0:00:25.270 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_mount_device_matches": null,
        "storage_test_mount_expected_match_count": null,
        "storage_test_mount_point_matches": null,
        "storage_test_swap_expected_matches": null,
        "storage_test_swaps": null,
        "storage_test_sys_node": null
    },
    "changed": false
}

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-fstab.yml:2
Wednesday 06 July 2022  14:53:45 +0000 (0:00:00.039)       0:00:25.309 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_fstab_expected_id_matches": "1",
        "storage_test_fstab_expected_mount_options_matches": "1",
        "storage_test_fstab_expected_mount_point_matches": "1",
        "storage_test_fstab_id_matches": [
            "/dev/mapper/foo-test1 "
        ],
        "storage_test_fstab_mount_options_matches": [
            " /opt/test1 xfs defaults "
        ],
        "storage_test_fstab_mount_point_matches": [
            " /opt/test1 "
        ]
    },
    "changed": false
}

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-fstab.yml:12
Wednesday 06 July 2022  14:53:45 +0000 (0:00:00.062)       0:00:25.372 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-fstab.yml:19
Wednesday 06 July 2022  14:53:45 +0000 (0:00:00.051)       0:00:25.423 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-fstab.yml:25
Wednesday 06 July 2022  14:53:45 +0000 (0:00:00.050)       0:00:25.474 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Clean up variables] ******************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-fstab.yml:34
Wednesday 06 July 2022  14:53:45 +0000 (0:00:00.035)       0:00:25.509 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_fstab_expected_id_matches": null,
        "storage_test_fstab_expected_mount_options_matches": null,
        "storage_test_fstab_expected_mount_point_matches": null,
        "storage_test_fstab_id_matches": null,
        "storage_test_fstab_mount_options_matches": null,
        "storage_test_fstab_mount_point_matches": null
    },
    "changed": false
}

TASK [Verify fs type] **********************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-fs.yml:4
Wednesday 06 July 2022  14:53:45 +0000 (0:00:00.033)       0:00:25.543 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-fs.yml:10
Wednesday 06 July 2022  14:53:45 +0000 (0:00:00.038)       0:00:25.581 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-device.yml:4
Wednesday 06 July 2022  14:53:45 +0000 (0:00:00.040)       0:00:25.622 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "stat": {
        "atime": 1657119217.3117163,
        "attr_flags": "",
        "attributes": [],
        "block_size": 4096,
        "blocks": 0,
        "charset": "binary",
        "ctime": 1657119215.3157163,
        "dev": 5,
        "device_type": 64768,
        "executable": false,
        "exists": true,
        "gid": 6,
        "gr_name": "disk",
        "inode": 505,
        "isblk": true,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": false,
        "issock": false,
        "isuid": false,
        "mimetype": "inode/symlink",
        "mode": "0660",
        "mtime": 1657119215.3157163,
        "nlink": 1,
        "path": "/dev/mapper/foo-test1",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": false,
        "rusr": true,
        "size": 0,
        "uid": 0,
        "version": null,
        "wgrp": true,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": false,
        "xoth": false,
        "xusr": false
    }
}

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-device.yml:10
Wednesday 06 July 2022  14:53:46 +0000 (0:00:00.401)       0:00:26.024 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-device.yml:15
Wednesday 06 July 2022  14:53:46 +0000 (0:00:00.041)       0:00:26.065 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-device.yml:21
Wednesday 06 July 2022  14:53:46 +0000 (0:00:00.041)       0:00:26.106 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "st_volume_type": "lvm"
    },
    "changed": false
}

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-device.yml:25
Wednesday 06 July 2022  14:53:46 +0000 (0:00:00.042)       0:00:26.149 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-device.yml:30
Wednesday 06 July 2022  14:53:46 +0000 (0:00:00.026)       0:00:26.175 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:3
Wednesday 06 July 2022  14:53:46 +0000 (0:00:00.040)       0:00:26.216 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Ensure cryptsetup is present] ********************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:10
Wednesday 06 July 2022  14:53:46 +0000 (0:00:00.025)       0:00:26.241 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:15
Wednesday 06 July 2022  14:53:48 +0000 (0:00:02.060)       0:00:28.302 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:21
Wednesday 06 July 2022  14:53:48 +0000 (0:00:00.024)       0:00:28.326 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:27
Wednesday 06 July 2022  14:53:48 +0000 (0:00:00.022)       0:00:28.348 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:33
Wednesday 06 July 2022  14:53:48 +0000 (0:00:00.093)       0:00:28.442 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:39
Wednesday 06 July 2022  14:53:48 +0000 (0:00:00.023)       0:00:28.465 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:44
Wednesday 06 July 2022  14:53:48 +0000 (0:00:00.023)       0:00:28.489 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:50
Wednesday 06 July 2022  14:53:48 +0000 (0:00:00.023)       0:00:28.512 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:56
Wednesday 06 July 2022  14:53:48 +0000 (0:00:00.026)       0:00:28.538 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:62
Wednesday 06 July 2022  14:53:48 +0000 (0:00:00.025)       0:00:28.563 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": [],
        "_storage_test_expected_crypttab_entries": "0",
        "_storage_test_expected_crypttab_key_file": "-"
    },
    "changed": false
}

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:67
Wednesday 06 July 2022  14:53:48 +0000 (0:00:00.051)       0:00:28.615 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:72
Wednesday 06 July 2022  14:53:48 +0000 (0:00:00.091)       0:00:28.706 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:78
Wednesday 06 July 2022  14:53:48 +0000 (0:00:00.035)       0:00:28.742 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:84
Wednesday 06 July 2022  14:53:49 +0000 (0:00:00.065)       0:00:28.808 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:90
Wednesday 06 July 2022  14:53:49 +0000 (0:00:00.037)       0:00:28.845 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": null,
        "_storage_test_expected_crypttab_entries": null,
        "_storage_test_expected_crypttab_key_file": null
    },
    "changed": false
}

TASK [get information about RAID] **********************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-md.yml:7
Wednesday 06 July 2022  14:53:49 +0000 (0:00:00.067)       0:00:28.912 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-md.yml:13
Wednesday 06 July 2022  14:53:49 +0000 (0:00:00.067)       0:00:28.979 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-md.yml:17
Wednesday 06 July 2022  14:53:49 +0000 (0:00:00.037)       0:00:29.017 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-md.yml:21
Wednesday 06 July 2022  14:53:49 +0000 (0:00:00.038)       0:00:29.055 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-md.yml:25
Wednesday 06 July 2022  14:53:49 +0000 (0:00:00.039)       0:00:29.094 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-md.yml:31
Wednesday 06 July 2022  14:53:49 +0000 (0:00:00.037)       0:00:29.132 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-md.yml:37
Wednesday 06 July 2022  14:53:49 +0000 (0:00:00.036)       0:00:29.168 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-size.yml:3
Wednesday 06 July 2022  14:53:49 +0000 (0:00:00.037)       0:00:29.206 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "bytes": 3221225472,
    "changed": false,
    "lvm": "3g",
    "parted": "3GiB",
    "size": "3 GiB"
}

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-size.yml:9
Wednesday 06 July 2022  14:53:49 +0000 (0:00:00.496)       0:00:29.702 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "bytes": 3221225472,
    "changed": false,
    "lvm": "3g",
    "parted": "3GiB",
    "size": "3 GiB"
}

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-size.yml:15
Wednesday 06 July 2022  14:53:50 +0000 (0:00:00.410)       0:00:30.113 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_expected_size": "3221225472"
    },
    "changed": false
}

TASK [debug] *******************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-size.yml:20
Wednesday 06 July 2022  14:53:50 +0000 (0:00:00.050)       0:00:30.163 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "storage_test_expected_size": "3221225472"
}

TASK [debug] *******************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-size.yml:25
Wednesday 06 July 2022  14:53:50 +0000 (0:00:00.034)       0:00:30.197 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-size.yml:28
Wednesday 06 July 2022  14:53:50 +0000 (0:00:00.035)       0:00:30.233 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-size.yml:31
Wednesday 06 July 2022  14:53:50 +0000 (0:00:00.040)       0:00:30.273 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [debug] *******************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-size.yml:36
Wednesday 06 July 2022  14:53:50 +0000 (0:00:00.039)       0:00:30.313 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-size.yml:39
Wednesday 06 July 2022  14:53:50 +0000 (0:00:00.036)       0:00:30.349 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [debug] *******************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-size.yml:44
Wednesday 06 July 2022  14:53:50 +0000 (0:00:00.036)       0:00:30.386 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "storage_test_actual_size": {
        "bytes": 3221225472,
        "changed": false,
        "failed": false,
        "lvm": "3g",
        "parted": "3GiB",
        "size": "3 GiB"
    }
}

TASK [debug] *******************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-size.yml:47
Wednesday 06 July 2022  14:53:50 +0000 (0:00:00.039)       0:00:30.426 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "storage_test_expected_size": "3221225472"
}

TASK [assert] ******************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-size.yml:50
Wednesday 06 July 2022  14:53:50 +0000 (0:00:00.038)       0:00:30.464 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-cache.yml:6
Wednesday 06 July 2022  14:53:50 +0000 (0:00:00.050)       0:00:30.515 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "cmd": [
        "lvs",
        "--noheadings",
        "--nameprefixes",
        "--units=b",
        "--nosuffix",
        "--unquoted",
        "-o",
        "name,attr,cache_total_blocks,chunk_size,segtype",
        "foo/test1"
    ],
    "delta": "0:00:00.040647",
    "end": "2022-07-06 14:53:49.872604",
    "rc": 0,
    "start": "2022-07-06 14:53:49.831957"
}

STDOUT:

  LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear
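
(Aside: the next set_fact reduces this --nameprefixes output to storage_test_lv_segtype == ["linear"]. A hedged sketch of one way to do that extraction -- the real task in test-verify-volume-cache.yml may differ, and lvs_output is an assumed register name:)

    # Hypothetical sketch of the segtype extraction reflected in the following set_fact result.
    # regex_findall pulls every LVM2_SEGTYPE=<value> token out of the prefixed lvs output,
    # yielding ["linear"] for the run above.
    - name: set_fact (sketch)
      set_fact:
        storage_test_lv_segtype: "{{ lvs_output.stdout | regex_findall('LVM2_SEGTYPE=(\\S+)') }}"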

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-cache.yml:14
Wednesday 06 July 2022  14:53:51 +0000 (0:00:00.435)       0:00:30.950 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_lv_segtype": [
            "linear"
        ]
    },
    "changed": false
}

TASK [check segment type] ******************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-cache.yml:17
Wednesday 06 July 2022  14:53:51 +0000 (0:00:00.052)       0:00:31.002 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-cache.yml:22
Wednesday 06 July 2022  14:53:51 +0000 (0:00:00.054)       0:00:31.057 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-cache.yml:26
Wednesday 06 July 2022  14:53:51 +0000 (0:00:00.041)       0:00:31.098 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-cache.yml:32
Wednesday 06 July 2022  14:53:51 +0000 (0:00:00.041)       0:00:31.140 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check cache size] ********************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-cache.yml:36
Wednesday 06 July 2022  14:53:51 +0000 (0:00:00.039)       0:00:31.180 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Clean up facts] **********************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume.yml:16
Wednesday 06 July 2022  14:53:51 +0000 (0:00:00.039)       0:00:31.219 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_volume_present": null
    },
    "changed": false
}

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-role-results.yml:40
Wednesday 06 July 2022  14:53:51 +0000 (0:00:00.085)       0:00:31.304 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_pool": null
    },
    "changed": false
}

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmpus9dv81c/tests/storage/verify-role-results.yml:47
Wednesday 06 July 2022  14:53:51 +0000 (0:00:00.036)       0:00:31.340 ******** 

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-role-results.yml:57
Wednesday 06 July 2022  14:53:51 +0000 (0:00:00.022)       0:00:31.362 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_blkinfo": null,
        "storage_test_crypttab": null,
        "storage_test_fstab": null,
        "storage_test_volume": null
    },
    "changed": false
}

TASK [Change the mount location to ""] *****************************************
task path: /tmp/tmpus9dv81c/tests/storage/tests_remove_mount.yml:41
Wednesday 06 July 2022  14:53:51 +0000 (0:00:00.081)       0:00:31.444 ******** 

TASK [fedora.linux_system_roles.storage : set platform/version specific variables] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Wednesday 06 July 2022  14:53:51 +0000 (0:00:00.086)       0:00:31.530 ******** 
included: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for /cache/fedora-35.qcow2.snap

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Wednesday 06 July 2022  14:53:51 +0000 (0:00:00.039)       0:00:31.569 ******** 
ok: [/cache/fedora-35.qcow2.snap]

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Wednesday 06 July 2022  14:53:52 +0000 (0:00:00.560)       0:00:32.129 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => (item=RedHat.yml)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
ok: [/cache/fedora-35.qcow2.snap] => (item=Fedora.yml) => {
    "ansible_facts": {
        "_storage_copr_packages": [
            {
                "packages": [
                    "vdo",
                    "kmod-vdo"
                ],
                "repository": "rhawalsh/dm-vdo"
            }
        ],
        "_storage_copr_support_packages": [
            "dnf-plugins-core"
        ],
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/vars/Fedora.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "Fedora.yml"
}
skipping: [/cache/fedora-35.qcow2.snap] => (item=Fedora_35.yml)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "Fedora_35.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [/cache/fedora-35.qcow2.snap] => (item=Fedora_35.yml)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "Fedora_35.yml",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Wednesday 06 July 2022  14:53:52 +0000 (0:00:00.069)       0:00:32.199 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Wednesday 06 July 2022  14:53:52 +0000 (0:00:00.044)       0:00:32.244 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : include the appropriate provider tasks] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Wednesday 06 July 2022  14:53:52 +0000 (0:00:00.039)       0:00:32.284 ******** 
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
included: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for /cache/fedora-35.qcow2.snap

TASK [fedora.linux_system_roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
Wednesday 06 July 2022  14:53:52 +0000 (0:00:00.057)       0:00:32.341 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : make sure blivet is available] *******
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:7
Wednesday 06 July 2022  14:53:52 +0000 (0:00:00.021)       0:00:32.362 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : show storage_pools] ******************
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:13
Wednesday 06 July 2022  14:53:52 +0000 (0:00:00.030)       0:00:32.393 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "storage_pools": [
        {
            "disks": [
                "sda"
            ],
            "name": "foo",
            "volumes": [
                {
                    "mount_point": "",
                    "name": "test1",
                    "size": "3g"
                }
            ]
        }
    ]
}
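
The storage_pools value printed above mirrors the parameters the test passes when it re-runs the role with an empty mount point; a minimal sketch of such an invocation (pool/volume values copied from the output above, task and role wiring assumed rather than taken from tests_remove_mount.yml):

- name: Change the mount location to ""
  include_role:
    name: fedora.linux_system_roles.storage
  vars:
    storage_pools:
      - name: foo
        disks:
          - sda
        volumes:
          - name: test1
            size: 3g
            mount_point: ""   # empty string: keep the volume, drop its fstab entry/mount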

TASK [fedora.linux_system_roles.storage : show storage_volumes] ****************
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:18
Wednesday 06 July 2022  14:53:52 +0000 (0:00:00.038)       0:00:32.432 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined"
}

TASK [fedora.linux_system_roles.storage : get required packages] ***************
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:23
Wednesday 06 July 2022  14:53:52 +0000 (0:00:00.037)       0:00:32.469 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : enable copr repositories if needed] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:35
Wednesday 06 July 2022  14:53:52 +0000 (0:00:00.027)       0:00:32.497 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : make sure required packages are installed] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:41
Wednesday 06 July 2022  14:53:52 +0000 (0:00:00.030)       0:00:32.528 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : get service facts] *******************
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:47
Wednesday 06 July 2022  14:53:52 +0000 (0:00:00.027)       0:00:32.555 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] *****
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:53
Wednesday 06 July 2022  14:53:52 +0000 (0:00:00.028)       0:00:32.583 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_cryptsetup_services": []
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58
Wednesday 06 July 2022  14:53:52 +0000 (0:00:00.060)       0:00:32.644 ******** 

TASK [fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64
Wednesday 06 July 2022  14:53:52 +0000 (0:00:00.023)       0:00:32.667 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [
        "/dev/sr0",
        "/dev/vda1",
        "/dev/vda2",
        "/dev/vda3",
        "/dev/vda4",
        "/dev/vda5",
        "/dev/mapper/foo-test1",
        "/dev/sdb",
        "/dev/sdc",
        "/dev/nvme0n1",
        "/dev/nvme1n1",
        "/dev/nvme2n1",
        "/dev/vdb1",
        "/dev/vdc1",
        "/dev/vdd",
        "/dev/vde",
        "/dev/vdf",
        "/dev/zram0"
    ],
    "mounts": [
        {
            "path": "/opt/test1",
            "state": "absent"
        }
    ],
    "packages": [
        "xfsprogs",
        "lvm2",
        "e2fsprogs",
        "btrfs-progs",
        "dosfstools"
    ],
    "pools": [
        {
            "disks": [
                "sda"
            ],
            "encryption": false,
            "encryption_cipher": null,
            "encryption_key": null,
            "encryption_key_size": null,
            "encryption_luks_version": null,
            "encryption_password": null,
            "name": "foo",
            "raid_chunk_size": null,
            "raid_device_count": null,
            "raid_level": null,
            "raid_metadata_version": null,
            "raid_spare_count": null,
            "state": "present",
            "type": "lvm",
            "volumes": [
                {
                    "_device": "/dev/mapper/foo-test1",
                    "_kernel_device": "/dev/dm-0",
                    "_mount_id": "/dev/mapper/foo-test1",
                    "_raw_device": "/dev/mapper/foo-test1",
                    "_raw_kernel_device": "/dev/dm-0",
                    "cache_devices": [],
                    "cache_mode": null,
                    "cache_size": 0,
                    "cached": false,
                    "compression": null,
                    "deduplication": null,
                    "disks": [
                        "sda"
                    ],
                    "encryption": false,
                    "encryption_cipher": null,
                    "encryption_key": null,
                    "encryption_key_size": null,
                    "encryption_luks_version": null,
                    "encryption_password": null,
                    "fs_create_options": "",
                    "fs_label": "",
                    "fs_overwrite_existing": true,
                    "fs_type": "xfs",
                    "mount_check": 0,
                    "mount_device_identifier": "uuid",
                    "mount_options": "defaults",
                    "mount_passno": 0,
                    "mount_point": "",
                    "name": "test1",
                    "raid_chunk_size": null,
                    "raid_device_count": null,
                    "raid_disks": [],
                    "raid_level": null,
                    "raid_metadata_version": null,
                    "raid_spare_count": null,
                    "size": "3g",
                    "state": "present",
                    "thin": false,
                    "thin_pool_name": null,
                    "thin_pool_size": null,
                    "type": "lvm",
                    "vdo_pool_size": null
                }
            ]
        }
    ],
    "volumes": []
}

TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:78
Wednesday 06 July 2022  14:53:54 +0000 (0:00:02.034)       0:00:34.702 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90
Wednesday 06 July 2022  14:53:55 +0000 (0:00:00.097)       0:00:34.799 ******** 

TASK [fedora.linux_system_roles.storage : show blivet_output] ******************
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:96
Wednesday 06 July 2022  14:53:55 +0000 (0:00:00.023)       0:00:34.822 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "blivet_output": {
        "actions": [],
        "changed": false,
        "crypts": [],
        "failed": false,
        "leaves": [
            "/dev/sr0",
            "/dev/vda1",
            "/dev/vda2",
            "/dev/vda3",
            "/dev/vda4",
            "/dev/vda5",
            "/dev/mapper/foo-test1",
            "/dev/sdb",
            "/dev/sdc",
            "/dev/nvme0n1",
            "/dev/nvme1n1",
            "/dev/nvme2n1",
            "/dev/vdb1",
            "/dev/vdc1",
            "/dev/vdd",
            "/dev/vde",
            "/dev/vdf",
            "/dev/zram0"
        ],
        "mounts": [
            {
                "path": "/opt/test1",
                "state": "absent"
            }
        ],
        "packages": [
            "xfsprogs",
            "lvm2",
            "e2fsprogs",
            "btrfs-progs",
            "dosfstools"
        ],
        "pools": [
            {
                "disks": [
                    "sda"
                ],
                "encryption": false,
                "encryption_cipher": null,
                "encryption_key": null,
                "encryption_key_size": null,
                "encryption_luks_version": null,
                "encryption_password": null,
                "name": "foo",
                "raid_chunk_size": null,
                "raid_device_count": null,
                "raid_level": null,
                "raid_metadata_version": null,
                "raid_spare_count": null,
                "state": "present",
                "type": "lvm",
                "volumes": [
                    {
                        "_device": "/dev/mapper/foo-test1",
                        "_kernel_device": "/dev/dm-0",
                        "_mount_id": "/dev/mapper/foo-test1",
                        "_raw_device": "/dev/mapper/foo-test1",
                        "_raw_kernel_device": "/dev/dm-0",
                        "cache_devices": [],
                        "cache_mode": null,
                        "cache_size": 0,
                        "cached": false,
                        "compression": null,
                        "deduplication": null,
                        "disks": [
                            "sda"
                        ],
                        "encryption": false,
                        "encryption_cipher": null,
                        "encryption_key": null,
                        "encryption_key_size": null,
                        "encryption_luks_version": null,
                        "encryption_password": null,
                        "fs_create_options": "",
                        "fs_label": "",
                        "fs_overwrite_existing": true,
                        "fs_type": "xfs",
                        "mount_check": 0,
                        "mount_device_identifier": "uuid",
                        "mount_options": "defaults",
                        "mount_passno": 0,
                        "mount_point": "",
                        "name": "test1",
                        "raid_chunk_size": null,
                        "raid_device_count": null,
                        "raid_disks": [],
                        "raid_level": null,
                        "raid_metadata_version": null,
                        "raid_spare_count": null,
                        "size": "3g",
                        "state": "present",
                        "thin": false,
                        "thin_pool_name": null,
                        "thin_pool_size": null,
                        "type": "lvm",
                        "vdo_pool_size": null
                    }
                ]
            }
        ],
        "volumes": []
    }
}

TASK [fedora.linux_system_roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:101
Wednesday 06 July 2022  14:53:55 +0000 (0:00:00.039)       0:00:34.862 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_pools_list": [
            {
                "disks": [
                    "sda"
                ],
                "encryption": false,
                "encryption_cipher": null,
                "encryption_key": null,
                "encryption_key_size": null,
                "encryption_luks_version": null,
                "encryption_password": null,
                "name": "foo",
                "raid_chunk_size": null,
                "raid_device_count": null,
                "raid_level": null,
                "raid_metadata_version": null,
                "raid_spare_count": null,
                "state": "present",
                "type": "lvm",
                "volumes": [
                    {
                        "_device": "/dev/mapper/foo-test1",
                        "_kernel_device": "/dev/dm-0",
                        "_mount_id": "/dev/mapper/foo-test1",
                        "_raw_device": "/dev/mapper/foo-test1",
                        "_raw_kernel_device": "/dev/dm-0",
                        "cache_devices": [],
                        "cache_mode": null,
                        "cache_size": 0,
                        "cached": false,
                        "compression": null,
                        "deduplication": null,
                        "disks": [
                            "sda"
                        ],
                        "encryption": false,
                        "encryption_cipher": null,
                        "encryption_key": null,
                        "encryption_key_size": null,
                        "encryption_luks_version": null,
                        "encryption_password": null,
                        "fs_create_options": "",
                        "fs_label": "",
                        "fs_overwrite_existing": true,
                        "fs_type": "xfs",
                        "mount_check": 0,
                        "mount_device_identifier": "uuid",
                        "mount_options": "defaults",
                        "mount_passno": 0,
                        "mount_point": "",
                        "name": "test1",
                        "raid_chunk_size": null,
                        "raid_device_count": null,
                        "raid_disks": [],
                        "raid_level": null,
                        "raid_metadata_version": null,
                        "raid_spare_count": null,
                        "size": "3g",
                        "state": "present",
                        "thin": false,
                        "thin_pool_name": null,
                        "thin_pool_size": null,
                        "type": "lvm",
                        "vdo_pool_size": null
                    }
                ]
            }
        ]
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:105
Wednesday 06 July 2022  14:53:55 +0000 (0:00:00.042)       0:00:34.905 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : remove obsolete mounts] **************
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121
Wednesday 06 July 2022  14:53:55 +0000 (0:00:00.038)       0:00:34.943 ******** 
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
changed: [/cache/fedora-35.qcow2.snap] => (item={'path': '/opt/test1', 'state': 'absent'}) => {
    "ansible_loop_var": "mount_info",
    "backup_file": "",
    "boot": "yes",
    "changed": true,
    "dump": "0",
    "fstab": "/etc/fstab",
    "mount_info": {
        "path": "/opt/test1",
        "state": "absent"
    },
    "name": "/opt/test1",
    "opts": "defaults",
    "passno": "0"
}
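
The changed item above is the role looping the mount module over blivet_output.mounts; a standalone equivalent of that single iteration would look roughly like this (module and parameters match the redirect message and result fields shown; the loop itself is omitted):

- name: Remove obsolete mount of /opt/test1
  ansible.posix.mount:
    path: /opt/test1
    state: absent   # removes the /etc/fstab entry and unmounts it if currently mounted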

TASK [fedora.linux_system_roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132
Wednesday 06 July 2022  14:53:55 +0000 (0:00:00.425)       0:00:35.368 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "name": null,
    "status": {}
}
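
The refresh step above (name null, empty status) is consistent with a plain daemon-reload through the systemd module; a minimal sketch, assuming ansible.builtin.systemd:

- name: Tell systemd to refresh its view of /etc/fstab
  ansible.builtin.systemd:
    daemon_reload: true   # picks up the mount-unit changes implied by the edited /etc/fstab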

TASK [fedora.linux_system_roles.storage : set up new/current mounts] ***********
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:137
Wednesday 06 July 2022  14:53:56 +0000 (0:00:00.727)       0:00:36.096 ******** 

TASK [fedora.linux_system_roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148
Wednesday 06 July 2022  14:53:56 +0000 (0:00:00.040)       0:00:36.137 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "name": null,
    "status": {}
}

TASK [fedora.linux_system_roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:156
Wednesday 06 July 2022  14:53:57 +0000 (0:00:00.703)       0:00:36.840 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "stat": {
        "atime": 1657093385.4860332,
        "attr_flags": "",
        "attributes": [],
        "block_size": 4096,
        "blocks": 0,
        "charset": "binary",
        "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709",
        "ctime": 1657005647.423,
        "dev": 31,
        "device_type": 0,
        "executable": false,
        "exists": true,
        "gid": 0,
        "gr_name": "root",
        "inode": 267,
        "isblk": false,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": true,
        "issock": false,
        "isuid": false,
        "mimetype": "inode/x-empty",
        "mode": "0600",
        "mtime": 1657005500.596,
        "nlink": 1,
        "path": "/etc/crypttab",
        "pw_name": "root",
        "readable": true,
        "rgrp": false,
        "roth": false,
        "rusr": true,
        "size": 0,
        "uid": 0,
        "version": "10",
        "wgrp": false,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": false,
        "xoth": false,
        "xusr": false
    }
}
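
The stat payload above has the shape of ansible.builtin.stat output for /etc/crypttab; a sketch of the call (the register variable name is illustrative, not taken from the role):

- name: Retrieve facts for the /etc/crypttab file
  ansible.builtin.stat:
    path: /etc/crypttab
  register: __storage_crypttab   # hypothetical name; the role's actual register is not shown in this log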

TASK [fedora.linux_system_roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161
Wednesday 06 July 2022  14:53:57 +0000 (0:00:00.408)       0:00:37.249 ******** 

TASK [fedora.linux_system_roles.storage : Update facts] ************************
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:183
Wednesday 06 July 2022  14:53:57 +0000 (0:00:00.024)       0:00:37.274 ******** 
ok: [/cache/fedora-35.qcow2.snap]
META: role_complete for /cache/fedora-35.qcow2.snap

TASK [include_tasks] ***********************************************************
task path: /tmp/tmpus9dv81c/tests/storage/tests_remove_mount.yml:53
Wednesday 06 July 2022  14:53:58 +0000 (0:00:00.922)       0:00:38.196 ******** 
included: /tmp/tmpus9dv81c/tests/storage/verify-role-results.yml for /cache/fedora-35.qcow2.snap

TASK [Print out pool information] **********************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-role-results.yml:1
Wednesday 06 July 2022  14:53:58 +0000 (0:00:00.043)       0:00:38.240 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "_storage_pools_list": [
        {
            "disks": [
                "sda"
            ],
            "encryption": false,
            "encryption_cipher": null,
            "encryption_key": null,
            "encryption_key_size": null,
            "encryption_luks_version": null,
            "encryption_password": null,
            "name": "foo",
            "raid_chunk_size": null,
            "raid_device_count": null,
            "raid_level": null,
            "raid_metadata_version": null,
            "raid_spare_count": null,
            "state": "present",
            "type": "lvm",
            "volumes": [
                {
                    "_device": "/dev/mapper/foo-test1",
                    "_kernel_device": "/dev/dm-0",
                    "_mount_id": "/dev/mapper/foo-test1",
                    "_raw_device": "/dev/mapper/foo-test1",
                    "_raw_kernel_device": "/dev/dm-0",
                    "cache_devices": [],
                    "cache_mode": null,
                    "cache_size": 0,
                    "cached": false,
                    "compression": null,
                    "deduplication": null,
                    "disks": [
                        "sda"
                    ],
                    "encryption": false,
                    "encryption_cipher": null,
                    "encryption_key": null,
                    "encryption_key_size": null,
                    "encryption_luks_version": null,
                    "encryption_password": null,
                    "fs_create_options": "",
                    "fs_label": "",
                    "fs_overwrite_existing": true,
                    "fs_type": "xfs",
                    "mount_check": 0,
                    "mount_device_identifier": "uuid",
                    "mount_options": "defaults",
                    "mount_passno": 0,
                    "mount_point": "",
                    "name": "test1",
                    "raid_chunk_size": null,
                    "raid_device_count": null,
                    "raid_disks": [],
                    "raid_level": null,
                    "raid_metadata_version": null,
                    "raid_spare_count": null,
                    "size": "3g",
                    "state": "present",
                    "thin": false,
                    "thin_pool_name": null,
                    "thin_pool_size": null,
                    "type": "lvm",
                    "vdo_pool_size": null
                }
            ]
        }
    ]
}

TASK [Print out volume information] ********************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-role-results.yml:6
Wednesday 06 July 2022  14:53:58 +0000 (0:00:00.055)       0:00:38.295 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {}

TASK [Collect info about the volumes.] *****************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-role-results.yml:14
Wednesday 06 July 2022  14:53:58 +0000 (0:00:00.036)       0:00:38.332 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "info": {
        "/dev/mapper/foo-test1": {
            "fstype": "xfs",
            "label": "",
            "name": "/dev/mapper/foo-test1",
            "size": "3G",
            "type": "lvm",
            "uuid": "1c41a782-c302-4977-99a1-bf5ce9244c3a"
        },
        "/dev/nvme0n1": {
            "fstype": "",
            "label": "",
            "name": "/dev/nvme0n1",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/nvme1n1": {
            "fstype": "",
            "label": "",
            "name": "/dev/nvme1n1",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/nvme2n1": {
            "fstype": "",
            "label": "",
            "name": "/dev/nvme2n1",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/sda": {
            "fstype": "LVM2_member",
            "label": "",
            "name": "/dev/sda",
            "size": "10G",
            "type": "disk",
            "uuid": "t9n4nJ-EghS-hyi4-sfCo-76en-eXSi-MMptMD"
        },
        "/dev/sdb": {
            "fstype": "",
            "label": "",
            "name": "/dev/sdb",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/sdc": {
            "fstype": "",
            "label": "",
            "name": "/dev/sdc",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/sr0": {
            "fstype": "iso9660",
            "label": "cidata",
            "name": "/dev/sr0",
            "size": "364K",
            "type": "rom",
            "uuid": "2022-07-06-14-53-07-00"
        },
        "/dev/vda": {
            "fstype": "",
            "label": "",
            "name": "/dev/vda",
            "size": "4G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vda1": {
            "fstype": "",
            "label": "",
            "name": "/dev/vda1",
            "size": "1M",
            "type": "partition",
            "uuid": ""
        },
        "/dev/vda2": {
            "fstype": "ext4",
            "label": "boot",
            "name": "/dev/vda2",
            "size": "500M",
            "type": "partition",
            "uuid": "5f2f82d0-ae0a-4574-8811-62a31a51a870"
        },
        "/dev/vda3": {
            "fstype": "vfat",
            "label": "",
            "name": "/dev/vda3",
            "size": "100M",
            "type": "partition",
            "uuid": "5B84-6DD7"
        },
        "/dev/vda4": {
            "fstype": "",
            "label": "",
            "name": "/dev/vda4",
            "size": "4M",
            "type": "partition",
            "uuid": ""
        },
        "/dev/vda5": {
            "fstype": "btrfs",
            "label": "fedora",
            "name": "/dev/vda5",
            "size": "3.4G",
            "type": "partition",
            "uuid": "fbdaf05f-1a41-4dc5-b56e-a10edb430f9a"
        },
        "/dev/vdb": {
            "fstype": "",
            "label": "",
            "name": "/dev/vdb",
            "size": "2G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vdb1": {
            "fstype": "ext4",
            "label": "yumcache",
            "name": "/dev/vdb1",
            "size": "2G",
            "type": "partition",
            "uuid": "e676dfc5-3e4b-4331-8ede-73c3f56d2cab"
        },
        "/dev/vdc": {
            "fstype": "",
            "label": "",
            "name": "/dev/vdc",
            "size": "2G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vdc1": {
            "fstype": "ext4",
            "label": "yumvarlib",
            "name": "/dev/vdc1",
            "size": "2G",
            "type": "partition",
            "uuid": "0c299eb4-81f5-4414-b246-b95738eb82f0"
        },
        "/dev/vdd": {
            "fstype": "",
            "label": "",
            "name": "/dev/vdd",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vde": {
            "fstype": "",
            "label": "",
            "name": "/dev/vde",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vdf": {
            "fstype": "",
            "label": "",
            "name": "/dev/vdf",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/zram0": {
            "fstype": "",
            "label": "",
            "name": "/dev/zram0",
            "size": "1.9G",
            "type": "disk",
            "uuid": ""
        }
    }
}

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-role-results.yml:19
Wednesday 06 July 2022  14:53:58 +0000 (0:00:00.436)       0:00:38.769 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "cmd": [
        "cat",
        "/etc/fstab"
    ],
    "delta": "0:00:00.004064",
    "end": "2022-07-06 14:53:58.128568",
    "rc": 0,
    "start": "2022-07-06 14:53:58.124504"
}

STDOUT:


#
# /etc/fstab
# Created by anaconda on Tue Jul  5 07:18:20 2022
#
# Accessible filesystems, by reference, are maintained under '/dev/disk/'.
# See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info.
#
# After editing this file, run 'systemctl daemon-reload' to update systemd
# units generated from this file.
#
UUID=fbdaf05f-1a41-4dc5-b56e-a10edb430f9a /                       btrfs   subvol=root,compress=zstd:1 0 0
UUID=5f2f82d0-ae0a-4574-8811-62a31a51a870 /boot                   ext4    defaults        1 2
UUID=5B84-6DD7          /boot/efi               vfat    defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
UUID=fbdaf05f-1a41-4dc5-b56e-a10edb430f9a /home                   btrfs   subvol=home,compress=zstd:1 0 0
/dev/vdb1	/var/cache/dnf	auto	defaults,nofail,comment=cloudconfig	0	2
/dev/vdc1	/var/lib/dnf	auto	defaults,nofail,comment=cloudconfig	0	2
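
The fstab listing above comes from a plain command read registered for the later verification steps; an equivalent sketch (the register name and changed_when are assumptions inferred from the "changed": false result):

- name: Read the /etc/fstab file for volume existence
  ansible.builtin.command: cat /etc/fstab
  register: storage_test_fstab   # assumed name, matching the fact cleaned up earlier in this log
  changed_when: false            # assumed; the result above reports changed: false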

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-role-results.yml:24
Wednesday 06 July 2022  14:53:59 +0000 (0:00:00.438)       0:00:39.207 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "cmd": [
        "cat",
        "/etc/crypttab"
    ],
    "delta": "0:00:00.003617",
    "end": "2022-07-06 14:53:58.533708",
    "failed_when_result": false,
    "rc": 0,
    "start": "2022-07-06 14:53:58.530091"
}

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmpus9dv81c/tests/storage/verify-role-results.yml:33
Wednesday 06 July 2022  14:53:59 +0000 (0:00:00.405)       0:00:39.612 ******** 
[WARNING]: The loop variable 'storage_test_pool' is already in use. You should
set the `loop_var` value in the `loop_control` option for the task to something
else to avoid variable collisions and unexpected behavior.
included: /tmp/tmpus9dv81c/tests/storage/test-verify-pool.yml for /cache/fedora-35.qcow2.snap => (item={'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '', 'name': 'test1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/dm-0'}], 'raid_chunk_size': None})

TASK [Set _storage_pool_tests] *************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool.yml:5
Wednesday 06 July 2022  14:53:59 +0000 (0:00:00.059)       0:00:39.672 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_pool_tests": [
            "members",
            "volumes"
        ]
    },
    "changed": false
}

TASK [include_tasks] ***********************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool.yml:18
Wednesday 06 July 2022  14:53:59 +0000 (0:00:00.035)       0:00:39.708 ******** 
included: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-members.yml for /cache/fedora-35.qcow2.snap => (item=members)
included: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-volumes.yml for /cache/fedora-35.qcow2.snap => (item=volumes)

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-members.yml:1
Wednesday 06 July 2022  14:53:59 +0000 (0:00:00.048)       0:00:39.757 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_expected_pv_count": "1",
        "_storage_test_pool_pvs_lvm": [
            "/dev/sda"
        ]
    },
    "changed": false
}

TASK [Get the canonical device path for each member device] ********************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-members.yml:6
Wednesday 06 July 2022  14:54:00 +0000 (0:00:00.053)       0:00:39.810 ******** 
ok: [/cache/fedora-35.qcow2.snap] => (item=/dev/sda) => {
    "ansible_loop_var": "pv",
    "changed": false,
    "device": "/dev/sda",
    "pv": "/dev/sda"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-members.yml:15
Wednesday 06 July 2022  14:54:00 +0000 (0:00:00.407)       0:00:40.218 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "__pvs_lvm_len": "1"
    },
    "changed": false
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-members.yml:19
Wednesday 06 July 2022  14:54:00 +0000 (0:00:00.051)       0:00:40.269 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_pool_pvs": [
            "/dev/sda"
        ]
    },
    "changed": false
}

TASK [Verify PV count] *********************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-members.yml:23
Wednesday 06 July 2022  14:54:00 +0000 (0:00:00.051)       0:00:40.321 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed
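
The PV-count check asserts against the facts set just above (__pvs_lvm_len and _storage_test_expected_pv_count); a hedged sketch of such an assertion, not the literal task in test-verify-pool-members.yml:

- name: Verify PV count
  ansible.builtin.assert:
    that:
      - __pvs_lvm_len | int == _storage_test_expected_pv_count | int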

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-members.yml:29
Wednesday 06 July 2022  14:54:00 +0000 (0:00:00.051)       0:00:40.373 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_expected_pv_type": "disk"
    },
    "changed": false
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-members.yml:33
Wednesday 06 July 2022  14:54:00 +0000 (0:00:00.037)       0:00:40.410 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_expected_pv_type": "disk"
    },
    "changed": false
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-members.yml:37
Wednesday 06 July 2022  14:54:00 +0000 (0:00:00.048)       0:00:40.459 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check the type of each PV] ***********************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-members.yml:41
Wednesday 06 July 2022  14:54:00 +0000 (0:00:00.026)       0:00:40.485 ******** 
ok: [/cache/fedora-35.qcow2.snap] => (item=/dev/sda) => {
    "ansible_loop_var": "pv",
    "changed": false,
    "pv": "/dev/sda"
}

MSG:

All assertions passed

TASK [Check MD RAID] ***********************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-members.yml:50
Wednesday 06 July 2022  14:54:00 +0000 (0:00:00.042)       0:00:40.528 ******** 
included: /tmp/tmpus9dv81c/tests/storage/verify-pool-md.yml for /cache/fedora-35.qcow2.snap

TASK [get information about RAID] **********************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-md.yml:6
Wednesday 06 July 2022  14:54:00 +0000 (0:00:00.042)       0:00:40.570 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-md.yml:12
Wednesday 06 July 2022  14:54:00 +0000 (0:00:00.025)       0:00:40.596 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-md.yml:16
Wednesday 06 July 2022  14:54:00 +0000 (0:00:00.025)       0:00:40.622 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-md.yml:20
Wednesday 06 July 2022  14:54:00 +0000 (0:00:00.024)       0:00:40.646 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-md.yml:24
Wednesday 06 July 2022  14:54:00 +0000 (0:00:00.025)       0:00:40.671 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-md.yml:30
Wednesday 06 July 2022  14:54:00 +0000 (0:00:00.027)       0:00:40.698 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-md.yml:36
Wednesday 06 July 2022  14:54:00 +0000 (0:00:00.025)       0:00:40.724 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-md.yml:44
Wednesday 06 July 2022  14:54:00 +0000 (0:00:00.024)       0:00:40.748 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_md_active_devices_re": null,
        "storage_test_md_metadata_version_re": null,
        "storage_test_md_spare_devices_re": null
    },
    "changed": false
}

TASK [Check LVM RAID] **********************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-members.yml:53
Wednesday 06 July 2022  14:54:01 +0000 (0:00:00.071)       0:00:40.820 ******** 
included: /tmp/tmpus9dv81c/tests/storage/verify-pool-members-lvmraid.yml for /cache/fedora-35.qcow2.snap

TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-members-lvmraid.yml:1
Wednesday 06 July 2022  14:54:01 +0000 (0:00:00.045)       0:00:40.865 ******** 
included: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-lvmraid.yml for /cache/fedora-35.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '', 'name': 'test1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/dm-0'})

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-lvmraid.yml:3
Wednesday 06 July 2022  14:54:01 +0000 (0:00:00.074)       0:00:40.940 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-lvmraid.yml:8
Wednesday 06 July 2022  14:54:01 +0000 (0:00:00.028)       0:00:40.969 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-lvmraid.yml:12
Wednesday 06 July 2022  14:54:01 +0000 (0:00:00.027)       0:00:40.996 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check Thin Pools] ********************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-members.yml:56
Wednesday 06 July 2022  14:54:01 +0000 (0:00:00.030)       0:00:41.027 ******** 
included: /tmp/tmpus9dv81c/tests/storage/verify-pool-members-thin.yml for /cache/fedora-35.qcow2.snap

TASK [Validate pool member thinpool settings] **********************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-members-thin.yml:1
Wednesday 06 July 2022  14:54:01 +0000 (0:00:00.047)       0:00:41.074 ******** 
included: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-thin.yml for /cache/fedora-35.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '', 'name': 'test1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/dm-0'})

TASK [Get information about thinpool] ******************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-thin.yml:3
Wednesday 06 July 2022  14:54:01 +0000 (0:00:00.044)       0:00:41.119 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check that volume is in correct thinpool (when thinp name is provided)] ***
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-thin.yml:8
Wednesday 06 July 2022  14:54:01 +0000 (0:00:00.026)       0:00:41.145 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check that volume is in thinpool (when thinp name is not provided)] ******
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-thin.yml:13
Wednesday 06 July 2022  14:54:01 +0000 (0:00:00.025)       0:00:41.171 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-thin.yml:17
Wednesday 06 July 2022  14:54:01 +0000 (0:00:00.024)       0:00:41.195 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check member encryption] *************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-members.yml:59
Wednesday 06 July 2022  14:54:01 +0000 (0:00:00.022)       0:00:41.218 ******** 
included: /tmp/tmpus9dv81c/tests/storage/verify-pool-members-encryption.yml for /cache/fedora-35.qcow2.snap

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-members-encryption.yml:4
Wednesday 06 July 2022  14:54:01 +0000 (0:00:00.046)       0:00:41.264 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_expected_crypttab_entries": "0",
        "_storage_test_expected_crypttab_key_file": "-"
    },
    "changed": false
}

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-members-encryption.yml:8
Wednesday 06 July 2022  14:54:01 +0000 (0:00:00.051)       0:00:41.316 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => (item=/dev/sda)  => {
    "_storage_test_pool_member_path": "/dev/sda",
    "ansible_loop_var": "_storage_test_pool_member_path",
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-members-encryption.yml:15
Wednesday 06 July 2022  14:54:01 +0000 (0:00:00.029)       0:00:41.345 ******** 
included: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-crypttab.yml for /cache/fedora-35.qcow2.snap => (item=/dev/sda)

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-crypttab.yml:1
Wednesday 06 July 2022  14:54:01 +0000 (0:00:00.043)       0:00:41.389 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": []
    },
    "changed": false
}

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-crypttab.yml:4
Wednesday 06 July 2022  14:54:01 +0000 (0:00:00.055)       0:00:41.444 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-crypttab.yml:9
Wednesday 06 July 2022  14:54:01 +0000 (0:00:00.055)       0:00:41.500 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-crypttab.yml:15
Wednesday 06 July 2022  14:54:01 +0000 (0:00:00.039)       0:00:41.539 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-crypttab.yml:21
Wednesday 06 July 2022  14:54:01 +0000 (0:00:00.039)       0:00:41.578 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-crypttab.yml:27
Wednesday 06 July 2022  14:54:01 +0000 (0:00:00.050)       0:00:41.629 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": null
    },
    "changed": false
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-members-encryption.yml:22
Wednesday 06 July 2022  14:54:01 +0000 (0:00:00.041)       0:00:41.671 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": null,
        "_storage_test_crypttab_key_file": null
    },
    "changed": false
}

TASK [Check VDO] ***************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-members.yml:62
Wednesday 06 July 2022  14:54:01 +0000 (0:00:00.034)       0:00:41.706 ******** 
included: /tmp/tmpus9dv81c/tests/storage/verify-pool-members-vdo.yml for /cache/fedora-35.qcow2.snap

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-members-vdo.yml:1
Wednesday 06 July 2022  14:54:01 +0000 (0:00:00.048)       0:00:41.754 ******** 
included: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-vdo.yml for /cache/fedora-35.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '', 'name': 'test1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/dm-0'})

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-vdo.yml:3
Wednesday 06 July 2022  14:54:02 +0000 (0:00:00.044)       0:00:41.799 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-vdo.yml:8
Wednesday 06 July 2022  14:54:02 +0000 (0:00:00.025)       0:00:41.824 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [assert] ******************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-vdo.yml:11
Wednesday 06 July 2022  14:54:02 +0000 (0:00:00.027)       0:00:41.851 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [assert] ******************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-vdo.yml:16
Wednesday 06 July 2022  14:54:02 +0000 (0:00:00.031)       0:00:41.883 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-vdo.yml:21
Wednesday 06 July 2022  14:54:02 +0000 (0:00:00.029)       0:00:41.912 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [assert] ******************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-vdo.yml:24
Wednesday 06 July 2022  14:54:02 +0000 (0:00:00.025)       0:00:41.938 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [assert] ******************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-vdo.yml:29
Wednesday 06 July 2022  14:54:02 +0000 (0:00:00.027)       0:00:41.966 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-vdo.yml:39
Wednesday 06 July 2022  14:54:02 +0000 (0:00:00.029)       0:00:41.995 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_vdo_status": null
    },
    "changed": false
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-members.yml:65
Wednesday 06 July 2022  14:54:02 +0000 (0:00:00.036)       0:00:42.032 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "__pvs_lvm_len": null,
        "_storage_test_expected_pv_count": null,
        "_storage_test_expected_pv_type": null,
        "_storage_test_pool_pvs": [],
        "_storage_test_pool_pvs_lvm": []
    },
    "changed": false
}

TASK [verify the volumes] ******************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-volumes.yml:3
Wednesday 06 July 2022  14:54:02 +0000 (0:00:00.037)       0:00:42.070 ******** 
[WARNING]: The loop variable 'storage_test_volume' is already in use. You
should set the `loop_var` value in the `loop_control` option for the task to
something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmpus9dv81c/tests/storage/test-verify-volume.yml for /cache/fedora-35.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '', 'name': 'test1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/dm-0'})

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume.yml:2
Wednesday 06 July 2022  14:54:02 +0000 (0:00:00.045)       0:00:42.115 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_volume_present": true,
        "_storage_volume_tests": [
            "mount",
            "fstab",
            "fs",
            "device",
            "encryption",
            "md",
            "size",
            "cache"
        ]
    },
    "changed": false
}

TASK [include_tasks] ***********************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume.yml:10
Wednesday 06 July 2022  14:54:02 +0000 (0:00:00.103)       0:00:42.218 ******** 
included: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-mount.yml for /cache/fedora-35.qcow2.snap => (item=mount)
included: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-fstab.yml for /cache/fedora-35.qcow2.snap => (item=fstab)
included: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-fs.yml for /cache/fedora-35.qcow2.snap => (item=fs)
included: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-device.yml for /cache/fedora-35.qcow2.snap => (item=device)
included: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml for /cache/fedora-35.qcow2.snap => (item=encryption)
included: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-md.yml for /cache/fedora-35.qcow2.snap => (item=md)
included: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-size.yml for /cache/fedora-35.qcow2.snap => (item=size)
included: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-cache.yml for /cache/fedora-35.qcow2.snap => (item=cache)

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-mount.yml:6
Wednesday 06 July 2022  14:54:02 +0000 (0:00:00.120)       0:00:42.339 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_device_path": "/dev/mapper/foo-test1"
    },
    "changed": false
}

TASK [Set some facts] **********************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-mount.yml:10
Wednesday 06 July 2022  14:54:02 +0000 (0:00:00.044)       0:00:42.384 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_mount_device_matches": [],
        "storage_test_mount_expected_match_count": "0",
        "storage_test_mount_point_matches": [],
        "storage_test_swap_expected_matches": "0"
    },
    "changed": false
}

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-mount.yml:20
Wednesday 06 July 2022  14:54:02 +0000 (0:00:00.056)       0:00:42.440 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-mount.yml:29
Wednesday 06 July 2022  14:54:02 +0000 (0:00:00.022)       0:00:42.463 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-mount.yml:37
Wednesday 06 July 2022  14:54:02 +0000 (0:00:00.049)       0:00:42.512 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [command] *****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-mount.yml:46
Wednesday 06 July 2022  14:54:02 +0000 (0:00:00.036)       0:00:42.548 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Gather swap info] ********************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-mount.yml:50
Wednesday 06 July 2022  14:54:02 +0000 (0:00:00.025)       0:00:42.574 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify swap status] ******************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-mount.yml:55
Wednesday 06 July 2022  14:54:02 +0000 (0:00:00.025)       0:00:42.599 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Unset facts] *************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-mount.yml:65
Wednesday 06 July 2022  14:54:02 +0000 (0:00:00.025)       0:00:42.625 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_mount_device_matches": null,
        "storage_test_mount_expected_match_count": null,
        "storage_test_mount_point_matches": null,
        "storage_test_swap_expected_matches": null,
        "storage_test_swaps": null,
        "storage_test_sys_node": null
    },
    "changed": false
}

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-fstab.yml:2
Wednesday 06 July 2022  14:54:02 +0000 (0:00:00.035)       0:00:42.661 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_fstab_expected_id_matches": "0",
        "storage_test_fstab_expected_mount_options_matches": "1",
        "storage_test_fstab_expected_mount_point_matches": "0",
        "storage_test_fstab_id_matches": [],
        "storage_test_fstab_mount_options_matches": [],
        "storage_test_fstab_mount_point_matches": []
    },
    "changed": false
}

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-fstab.yml:12
Wednesday 06 July 2022  14:54:02 +0000 (0:00:00.063)       0:00:42.724 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-fstab.yml:19
Wednesday 06 July 2022  14:54:02 +0000 (0:00:00.051)       0:00:42.776 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-fstab.yml:25
Wednesday 06 July 2022  14:54:03 +0000 (0:00:00.052)       0:00:42.828 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Clean up variables] ******************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-fstab.yml:34
Wednesday 06 July 2022  14:54:03 +0000 (0:00:00.037)       0:00:42.866 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_fstab_expected_id_matches": null,
        "storage_test_fstab_expected_mount_options_matches": null,
        "storage_test_fstab_expected_mount_point_matches": null,
        "storage_test_fstab_id_matches": null,
        "storage_test_fstab_mount_options_matches": null,
        "storage_test_fstab_mount_point_matches": null
    },
    "changed": false
}

TASK [Verify fs type] **********************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-fs.yml:4
Wednesday 06 July 2022  14:54:03 +0000 (0:00:00.036)       0:00:42.903 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-fs.yml:10
Wednesday 06 July 2022  14:54:03 +0000 (0:00:00.042)       0:00:42.945 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-device.yml:4
Wednesday 06 July 2022  14:54:03 +0000 (0:00:00.039)       0:00:42.984 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "stat": {
        "atime": 1657119217.3117163,
        "attr_flags": "",
        "attributes": [],
        "block_size": 4096,
        "blocks": 0,
        "charset": "binary",
        "ctime": 1657119215.3157163,
        "dev": 5,
        "device_type": 64768,
        "executable": false,
        "exists": true,
        "gid": 6,
        "gr_name": "disk",
        "inode": 505,
        "isblk": true,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": false,
        "issock": false,
        "isuid": false,
        "mimetype": "inode/symlink",
        "mode": "0660",
        "mtime": 1657119215.3157163,
        "nlink": 1,
        "path": "/dev/mapper/foo-test1",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": false,
        "rusr": true,
        "size": 0,
        "uid": 0,
        "version": null,
        "wgrp": true,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": false,
        "xoth": false,
        "xusr": false
    }
}

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-device.yml:10
Wednesday 06 July 2022  14:54:03 +0000 (0:00:00.400)       0:00:43.385 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-device.yml:15
Wednesday 06 July 2022  14:54:03 +0000 (0:00:00.040)       0:00:43.425 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-device.yml:21
Wednesday 06 July 2022  14:54:03 +0000 (0:00:00.040)       0:00:43.465 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "st_volume_type": "lvm"
    },
    "changed": false
}

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-device.yml:25
Wednesday 06 July 2022  14:54:03 +0000 (0:00:00.036)       0:00:43.502 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-device.yml:30
Wednesday 06 July 2022  14:54:03 +0000 (0:00:00.025)       0:00:43.528 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:3
Wednesday 06 July 2022  14:54:03 +0000 (0:00:00.038)       0:00:43.567 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Ensure cryptsetup is present] ********************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:10
Wednesday 06 July 2022  14:54:03 +0000 (0:00:00.026)       0:00:43.593 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:15
Wednesday 06 July 2022  14:54:05 +0000 (0:00:02.070)       0:00:45.663 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:21
Wednesday 06 July 2022  14:54:05 +0000 (0:00:00.026)       0:00:45.689 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:27
Wednesday 06 July 2022  14:54:05 +0000 (0:00:00.024)       0:00:45.714 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:33
Wednesday 06 July 2022  14:54:05 +0000 (0:00:00.051)       0:00:45.765 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:39
Wednesday 06 July 2022  14:54:06 +0000 (0:00:00.025)       0:00:45.791 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:44
Wednesday 06 July 2022  14:54:06 +0000 (0:00:00.025)       0:00:45.817 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:50
Wednesday 06 July 2022  14:54:06 +0000 (0:00:00.023)       0:00:45.841 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:56
Wednesday 06 July 2022  14:54:06 +0000 (0:00:00.023)       0:00:45.864 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:62
Wednesday 06 July 2022  14:54:06 +0000 (0:00:00.026)       0:00:45.890 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": [],
        "_storage_test_expected_crypttab_entries": "0",
        "_storage_test_expected_crypttab_key_file": "-"
    },
    "changed": false
}

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:67
Wednesday 06 July 2022  14:54:06 +0000 (0:00:00.051)       0:00:45.942 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:72
Wednesday 06 July 2022  14:54:06 +0000 (0:00:00.051)       0:00:45.993 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:78
Wednesday 06 July 2022  14:54:06 +0000 (0:00:00.039)       0:00:46.033 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:84
Wednesday 06 July 2022  14:54:06 +0000 (0:00:00.040)       0:00:46.074 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:90
Wednesday 06 July 2022  14:54:06 +0000 (0:00:00.037)       0:00:46.111 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": null,
        "_storage_test_expected_crypttab_entries": null,
        "_storage_test_expected_crypttab_key_file": null
    },
    "changed": false
}

TASK [get information about RAID] **********************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-md.yml:7
Wednesday 06 July 2022  14:54:06 +0000 (0:00:00.034)       0:00:46.146 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-md.yml:13
Wednesday 06 July 2022  14:54:06 +0000 (0:00:00.034)       0:00:46.180 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-md.yml:17
Wednesday 06 July 2022  14:54:06 +0000 (0:00:00.042)       0:00:46.223 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-md.yml:21
Wednesday 06 July 2022  14:54:06 +0000 (0:00:00.035)       0:00:46.258 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-md.yml:25
Wednesday 06 July 2022  14:54:06 +0000 (0:00:00.037)       0:00:46.296 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-md.yml:31
Wednesday 06 July 2022  14:54:06 +0000 (0:00:00.037)       0:00:46.333 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-md.yml:37
Wednesday 06 July 2022  14:54:06 +0000 (0:00:00.043)       0:00:46.376 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-size.yml:3
Wednesday 06 July 2022  14:54:06 +0000 (0:00:00.037)       0:00:46.414 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "bytes": 3221225472,
    "changed": false,
    "lvm": "3g",
    "parted": "3GiB",
    "size": "3 GiB"
}

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-size.yml:9
Wednesday 06 July 2022  14:54:07 +0000 (0:00:00.395)       0:00:46.809 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "bytes": 3221225472,
    "changed": false,
    "lvm": "3g",
    "parted": "3GiB",
    "size": "3 GiB"
}
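
Both parse results resolve to the same byte count because the requested size "3g" is read as 3 GiB, i.e. 3 x 1024^3 = 3221225472 bytes, which is the value the assert further below compares against storage_test_expected_size.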

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-size.yml:15
Wednesday 06 July 2022  14:54:07 +0000 (0:00:00.419)       0:00:47.229 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_expected_size": "3221225472"
    },
    "changed": false
}

TASK [debug] *******************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-size.yml:20
Wednesday 06 July 2022  14:54:07 +0000 (0:00:00.089)       0:00:47.318 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "storage_test_expected_size": "3221225472"
}

TASK [debug] *******************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-size.yml:25
Wednesday 06 July 2022  14:54:07 +0000 (0:00:00.037)       0:00:47.355 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-size.yml:28
Wednesday 06 July 2022  14:54:07 +0000 (0:00:00.040)       0:00:47.396 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-size.yml:31
Wednesday 06 July 2022  14:54:07 +0000 (0:00:00.038)       0:00:47.434 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [debug] *******************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-size.yml:36
Wednesday 06 July 2022  14:54:07 +0000 (0:00:00.040)       0:00:47.474 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-size.yml:39
Wednesday 06 July 2022  14:54:07 +0000 (0:00:00.037)       0:00:47.512 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [debug] *******************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-size.yml:44
Wednesday 06 July 2022  14:54:07 +0000 (0:00:00.038)       0:00:47.550 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "storage_test_actual_size": {
        "bytes": 3221225472,
        "changed": false,
        "failed": false,
        "lvm": "3g",
        "parted": "3GiB",
        "size": "3 GiB"
    }
}

TASK [debug] *******************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-size.yml:47
Wednesday 06 July 2022  14:54:07 +0000 (0:00:00.033)       0:00:47.584 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "storage_test_expected_size": "3221225472"
}

TASK [assert] ******************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-size.yml:50
Wednesday 06 July 2022  14:54:07 +0000 (0:00:00.065)       0:00:47.650 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-cache.yml:6
Wednesday 06 July 2022  14:54:07 +0000 (0:00:00.051)       0:00:47.701 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "cmd": [
        "lvs",
        "--noheadings",
        "--nameprefixes",
        "--units=b",
        "--nosuffix",
        "--unquoted",
        "-o",
        "name,attr,cache_total_blocks,chunk_size,segtype",
        "foo/test1"
    ],
    "delta": "0:00:00.043711",
    "end": "2022-07-06 14:54:07.060476",
    "rc": 0,
    "start": "2022-07-06 14:54:07.016765"
}

STDOUT:

  LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-a----- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear
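
The cmd list above is the argv form of a single lvs call; a sketch of the equivalent task in playbook form, assuming the same options (the task wording is illustrative, not copied from the test file):

    - name: Get information about the LV    # hypothetical standalone equivalent
      command: >-
        lvs --noheadings --nameprefixes --units=b --nosuffix --unquoted
        -o name,attr,cache_total_blocks,chunk_size,segtype foo/test1
      register: lv_info
      changed_when: false

The empty LVM2_CACHE_TOTAL_BLOCKS and the linear segment type in STDOUT are what the following set_fact/assert tasks use to check that the LV is not cached.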

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-cache.yml:14
Wednesday 06 July 2022  14:54:08 +0000 (0:00:00.438)       0:00:48.140 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_lv_segtype": [
            "linear"
        ]
    },
    "changed": false
}

TASK [check segment type] ******************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-cache.yml:17
Wednesday 06 July 2022  14:54:08 +0000 (0:00:00.106)       0:00:48.247 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-cache.yml:22
Wednesday 06 July 2022  14:54:08 +0000 (0:00:00.052)       0:00:48.299 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-cache.yml:26
Wednesday 06 July 2022  14:54:08 +0000 (0:00:00.041)       0:00:48.341 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-cache.yml:32
Wednesday 06 July 2022  14:54:08 +0000 (0:00:00.039)       0:00:48.381 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check cache size] ********************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-cache.yml:36
Wednesday 06 July 2022  14:54:08 +0000 (0:00:00.042)       0:00:48.424 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Clean up facts] **********************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume.yml:16
Wednesday 06 July 2022  14:54:08 +0000 (0:00:00.039)       0:00:48.463 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_volume_present": null
    },
    "changed": false
}

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-role-results.yml:40
Wednesday 06 July 2022  14:54:08 +0000 (0:00:00.036)       0:00:48.500 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_pool": null
    },
    "changed": false
}

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmpus9dv81c/tests/storage/verify-role-results.yml:47
Wednesday 06 July 2022  14:54:08 +0000 (0:00:00.033)       0:00:48.533 ******** 

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-role-results.yml:57
Wednesday 06 July 2022  14:54:08 +0000 (0:00:00.021)       0:00:48.555 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_blkinfo": null,
        "storage_test_crypttab": null,
        "storage_test_fstab": null,
        "storage_test_volume": null
    },
    "changed": false
}

TASK [Repeat the previous invocation to verify idempotence] ********************
task path: /tmp/tmpus9dv81c/tests/storage/tests_remove_mount.yml:55
Wednesday 06 July 2022  14:54:08 +0000 (0:00:00.035)       0:00:48.590 ******** 

TASK [fedora.linux_system_roles.storage : set platform/version specific variables] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Wednesday 06 July 2022  14:54:08 +0000 (0:00:00.054)       0:00:48.644 ******** 
included: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for /cache/fedora-35.qcow2.snap

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Wednesday 06 July 2022  14:54:08 +0000 (0:00:00.037)       0:00:48.682 ******** 
ok: [/cache/fedora-35.qcow2.snap]

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Wednesday 06 July 2022  14:54:09 +0000 (0:00:00.554)       0:00:49.237 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => (item=RedHat.yml)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
ok: [/cache/fedora-35.qcow2.snap] => (item=Fedora.yml) => {
    "ansible_facts": {
        "_storage_copr_packages": [
            {
                "packages": [
                    "vdo",
                    "kmod-vdo"
                ],
                "repository": "rhawalsh/dm-vdo"
            }
        ],
        "_storage_copr_support_packages": [
            "dnf-plugins-core"
        ],
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/vars/Fedora.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "Fedora.yml"
}
skipping: [/cache/fedora-35.qcow2.snap] => (item=Fedora_35.yml)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "Fedora_35.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [/cache/fedora-35.qcow2.snap] => (item=Fedora_35.yml)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "Fedora_35.yml",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Wednesday 06 July 2022  14:54:09 +0000 (0:00:00.065)       0:00:49.302 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Wednesday 06 July 2022  14:54:09 +0000 (0:00:00.036)       0:00:49.339 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : include the appropriate provider tasks] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Wednesday 06 July 2022  14:54:09 +0000 (0:00:00.036)       0:00:49.376 ******** 
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
included: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for /cache/fedora-35.qcow2.snap

TASK [fedora.linux_system_roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
Wednesday 06 July 2022  14:54:09 +0000 (0:00:00.090)       0:00:49.466 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : make sure blivet is available] *******
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:7
Wednesday 06 July 2022  14:54:09 +0000 (0:00:00.030)       0:00:49.497 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : show storage_pools] ******************
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:13
Wednesday 06 July 2022  14:54:09 +0000 (0:00:00.035)       0:00:49.532 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "storage_pools": [
        {
            "disks": [
                "sda"
            ],
            "name": "foo",
            "volumes": [
                {
                    "mount_point": "",
                    "name": "test1",
                    "size": "3g"
                }
            ]
        }
    ]
}
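
The storage_pools value printed here is the role input for the repeated (idempotence) run; a minimal sketch of how such an invocation is typically written, assuming the include_role form these tests use (the surrounding task structure is illustrative, not copied from tests_remove_mount.yml):

    - name: Repeat the previous invocation to verify idempotence
      include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_pools:
          - name: foo
            disks:
              - sda
            volumes:
              - name: test1
                size: "3g"
                mount_point: ""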

TASK [fedora.linux_system_roles.storage : show storage_volumes] ****************
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:18
Wednesday 06 July 2022  14:54:09 +0000 (0:00:00.044)       0:00:49.576 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined"
}

TASK [fedora.linux_system_roles.storage : get required packages] ***************
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:23
Wednesday 06 July 2022  14:54:09 +0000 (0:00:00.041)       0:00:49.618 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : enable copr repositories if needed] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:35
Wednesday 06 July 2022  14:54:09 +0000 (0:00:00.032)       0:00:49.651 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : make sure required packages are installed] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:41
Wednesday 06 July 2022  14:54:09 +0000 (0:00:00.030)       0:00:49.681 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : get service facts] *******************
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:47
Wednesday 06 July 2022  14:54:09 +0000 (0:00:00.030)       0:00:49.712 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] *****
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:53
Wednesday 06 July 2022  14:54:09 +0000 (0:00:00.035)       0:00:49.748 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_cryptsetup_services": []
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58
Wednesday 06 July 2022  14:54:10 +0000 (0:00:00.061)       0:00:49.810 ******** 

TASK [fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64
Wednesday 06 July 2022  14:54:10 +0000 (0:00:00.024)       0:00:49.834 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [
        "/dev/sr0",
        "/dev/vda1",
        "/dev/vda2",
        "/dev/vda3",
        "/dev/vda4",
        "/dev/vda5",
        "/dev/mapper/foo-test1",
        "/dev/sdb",
        "/dev/sdc",
        "/dev/nvme0n1",
        "/dev/nvme1n1",
        "/dev/nvme2n1",
        "/dev/vdb1",
        "/dev/vdc1",
        "/dev/vdd",
        "/dev/vde",
        "/dev/vdf",
        "/dev/zram0"
    ],
    "mounts": [],
    "packages": [
        "btrfs-progs",
        "e2fsprogs",
        "xfsprogs",
        "dosfstools",
        "lvm2"
    ],
    "pools": [
        {
            "disks": [
                "sda"
            ],
            "encryption": false,
            "encryption_cipher": null,
            "encryption_key": null,
            "encryption_key_size": null,
            "encryption_luks_version": null,
            "encryption_password": null,
            "name": "foo",
            "raid_chunk_size": null,
            "raid_device_count": null,
            "raid_level": null,
            "raid_metadata_version": null,
            "raid_spare_count": null,
            "state": "present",
            "type": "lvm",
            "volumes": [
                {
                    "_device": "/dev/mapper/foo-test1",
                    "_kernel_device": "/dev/dm-0",
                    "_mount_id": "/dev/mapper/foo-test1",
                    "_raw_device": "/dev/mapper/foo-test1",
                    "_raw_kernel_device": "/dev/dm-0",
                    "cache_devices": [],
                    "cache_mode": null,
                    "cache_size": 0,
                    "cached": false,
                    "compression": null,
                    "deduplication": null,
                    "disks": [
                        "sda"
                    ],
                    "encryption": false,
                    "encryption_cipher": null,
                    "encryption_key": null,
                    "encryption_key_size": null,
                    "encryption_luks_version": null,
                    "encryption_password": null,
                    "fs_create_options": "",
                    "fs_label": "",
                    "fs_overwrite_existing": true,
                    "fs_type": "xfs",
                    "mount_check": 0,
                    "mount_device_identifier": "uuid",
                    "mount_options": "defaults",
                    "mount_passno": 0,
                    "mount_point": "",
                    "name": "test1",
                    "raid_chunk_size": null,
                    "raid_device_count": null,
                    "raid_disks": [],
                    "raid_level": null,
                    "raid_metadata_version": null,
                    "raid_spare_count": null,
                    "size": "3g",
                    "state": "present",
                    "thin": false,
                    "thin_pool_name": null,
                    "thin_pool_size": null,
                    "type": "lvm",
                    "vdo_pool_size": null
                }
            ]
        }
    ],
    "volumes": []
}

TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:78
Wednesday 06 July 2022  14:54:12 +0000 (0:00:01.951)       0:00:51.786 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90
Wednesday 06 July 2022  14:54:12 +0000 (0:00:00.040)       0:00:51.826 ******** 

TASK [fedora.linux_system_roles.storage : show blivet_output] ******************
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:96
Wednesday 06 July 2022  14:54:12 +0000 (0:00:00.024)       0:00:51.850 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "blivet_output": {
        "actions": [],
        "changed": false,
        "crypts": [],
        "failed": false,
        "leaves": [
            "/dev/sr0",
            "/dev/vda1",
            "/dev/vda2",
            "/dev/vda3",
            "/dev/vda4",
            "/dev/vda5",
            "/dev/mapper/foo-test1",
            "/dev/sdb",
            "/dev/sdc",
            "/dev/nvme0n1",
            "/dev/nvme1n1",
            "/dev/nvme2n1",
            "/dev/vdb1",
            "/dev/vdc1",
            "/dev/vdd",
            "/dev/vde",
            "/dev/vdf",
            "/dev/zram0"
        ],
        "mounts": [],
        "packages": [
            "btrfs-progs",
            "e2fsprogs",
            "xfsprogs",
            "dosfstools",
            "lvm2"
        ],
        "pools": [
            {
                "disks": [
                    "sda"
                ],
                "encryption": false,
                "encryption_cipher": null,
                "encryption_key": null,
                "encryption_key_size": null,
                "encryption_luks_version": null,
                "encryption_password": null,
                "name": "foo",
                "raid_chunk_size": null,
                "raid_device_count": null,
                "raid_level": null,
                "raid_metadata_version": null,
                "raid_spare_count": null,
                "state": "present",
                "type": "lvm",
                "volumes": [
                    {
                        "_device": "/dev/mapper/foo-test1",
                        "_kernel_device": "/dev/dm-0",
                        "_mount_id": "/dev/mapper/foo-test1",
                        "_raw_device": "/dev/mapper/foo-test1",
                        "_raw_kernel_device": "/dev/dm-0",
                        "cache_devices": [],
                        "cache_mode": null,
                        "cache_size": 0,
                        "cached": false,
                        "compression": null,
                        "deduplication": null,
                        "disks": [
                            "sda"
                        ],
                        "encryption": false,
                        "encryption_cipher": null,
                        "encryption_key": null,
                        "encryption_key_size": null,
                        "encryption_luks_version": null,
                        "encryption_password": null,
                        "fs_create_options": "",
                        "fs_label": "",
                        "fs_overwrite_existing": true,
                        "fs_type": "xfs",
                        "mount_check": 0,
                        "mount_device_identifier": "uuid",
                        "mount_options": "defaults",
                        "mount_passno": 0,
                        "mount_point": "",
                        "name": "test1",
                        "raid_chunk_size": null,
                        "raid_device_count": null,
                        "raid_disks": [],
                        "raid_level": null,
                        "raid_metadata_version": null,
                        "raid_spare_count": null,
                        "size": "3g",
                        "state": "present",
                        "thin": false,
                        "thin_pool_name": null,
                        "thin_pool_size": null,
                        "type": "lvm",
                        "vdo_pool_size": null
                    }
                ]
            }
        ],
        "volumes": []
    }
}

TASK [fedora.linux_system_roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:101
Wednesday 06 July 2022  14:54:12 +0000 (0:00:00.043)       0:00:51.894 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_pools_list": [
            {
                "disks": [
                    "sda"
                ],
                "encryption": false,
                "encryption_cipher": null,
                "encryption_key": null,
                "encryption_key_size": null,
                "encryption_luks_version": null,
                "encryption_password": null,
                "name": "foo",
                "raid_chunk_size": null,
                "raid_device_count": null,
                "raid_level": null,
                "raid_metadata_version": null,
                "raid_spare_count": null,
                "state": "present",
                "type": "lvm",
                "volumes": [
                    {
                        "_device": "/dev/mapper/foo-test1",
                        "_kernel_device": "/dev/dm-0",
                        "_mount_id": "/dev/mapper/foo-test1",
                        "_raw_device": "/dev/mapper/foo-test1",
                        "_raw_kernel_device": "/dev/dm-0",
                        "cache_devices": [],
                        "cache_mode": null,
                        "cache_size": 0,
                        "cached": false,
                        "compression": null,
                        "deduplication": null,
                        "disks": [
                            "sda"
                        ],
                        "encryption": false,
                        "encryption_cipher": null,
                        "encryption_key": null,
                        "encryption_key_size": null,
                        "encryption_luks_version": null,
                        "encryption_password": null,
                        "fs_create_options": "",
                        "fs_label": "",
                        "fs_overwrite_existing": true,
                        "fs_type": "xfs",
                        "mount_check": 0,
                        "mount_device_identifier": "uuid",
                        "mount_options": "defaults",
                        "mount_passno": 0,
                        "mount_point": "",
                        "name": "test1",
                        "raid_chunk_size": null,
                        "raid_device_count": null,
                        "raid_disks": [],
                        "raid_level": null,
                        "raid_metadata_version": null,
                        "raid_spare_count": null,
                        "size": "3g",
                        "state": "present",
                        "thin": false,
                        "thin_pool_name": null,
                        "thin_pool_size": null,
                        "type": "lvm",
                        "vdo_pool_size": null
                    }
                ]
            }
        ]
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:105
Wednesday 06 July 2022  14:54:12 +0000 (0:00:00.045)       0:00:51.939 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : remove obsolete mounts] **************
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121
Wednesday 06 July 2022  14:54:12 +0000 (0:00:00.041)       0:00:51.980 ******** 

TASK [fedora.linux_system_roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132
Wednesday 06 July 2022  14:54:12 +0000 (0:00:00.036)       0:00:52.017 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : set up new/current mounts] ***********
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:137
Wednesday 06 July 2022  14:54:12 +0000 (0:00:00.026)       0:00:52.044 ******** 

TASK [fedora.linux_system_roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148
Wednesday 06 July 2022  14:54:12 +0000 (0:00:00.041)       0:00:52.085 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:156
Wednesday 06 July 2022  14:54:12 +0000 (0:00:00.024)       0:00:52.110 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "stat": {
        "atime": 1657093385.4860332,
        "attr_flags": "",
        "attributes": [],
        "block_size": 4096,
        "blocks": 0,
        "charset": "binary",
        "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709",
        "ctime": 1657005647.423,
        "dev": 31,
        "device_type": 0,
        "executable": false,
        "exists": true,
        "gid": 0,
        "gr_name": "root",
        "inode": 267,
        "isblk": false,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": true,
        "issock": false,
        "isuid": false,
        "mimetype": "inode/x-empty",
        "mode": "0600",
        "mtime": 1657005500.596,
        "nlink": 1,
        "path": "/etc/crypttab",
        "pw_name": "root",
        "readable": true,
        "rgrp": false,
        "roth": false,
        "rusr": true,
        "size": 0,
        "uid": 0,
        "version": "10",
        "wgrp": false,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": false,
        "xoth": false,
        "xusr": false
    }
}

TASK [fedora.linux_system_roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161
Wednesday 06 July 2022  14:54:12 +0000 (0:00:00.413)       0:00:52.524 ******** 

TASK [fedora.linux_system_roles.storage : Update facts] ************************
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:183
Wednesday 06 July 2022  14:54:12 +0000 (0:00:00.025)       0:00:52.549 ******** 
ok: [/cache/fedora-35.qcow2.snap]
META: role_complete for /cache/fedora-35.qcow2.snap

TASK [include_tasks] ***********************************************************
task path: /tmp/tmpus9dv81c/tests/storage/tests_remove_mount.yml:67
Wednesday 06 July 2022  14:54:13 +0000 (0:00:00.928)       0:00:53.478 ******** 
included: /tmp/tmpus9dv81c/tests/storage/verify-role-results.yml for /cache/fedora-35.qcow2.snap

TASK [Print out pool information] **********************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-role-results.yml:1
Wednesday 06 July 2022  14:54:13 +0000 (0:00:00.049)       0:00:53.527 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "_storage_pools_list": [
        {
            "disks": [
                "sda"
            ],
            "encryption": false,
            "encryption_cipher": null,
            "encryption_key": null,
            "encryption_key_size": null,
            "encryption_luks_version": null,
            "encryption_password": null,
            "name": "foo",
            "raid_chunk_size": null,
            "raid_device_count": null,
            "raid_level": null,
            "raid_metadata_version": null,
            "raid_spare_count": null,
            "state": "present",
            "type": "lvm",
            "volumes": [
                {
                    "_device": "/dev/mapper/foo-test1",
                    "_kernel_device": "/dev/dm-0",
                    "_mount_id": "/dev/mapper/foo-test1",
                    "_raw_device": "/dev/mapper/foo-test1",
                    "_raw_kernel_device": "/dev/dm-0",
                    "cache_devices": [],
                    "cache_mode": null,
                    "cache_size": 0,
                    "cached": false,
                    "compression": null,
                    "deduplication": null,
                    "disks": [
                        "sda"
                    ],
                    "encryption": false,
                    "encryption_cipher": null,
                    "encryption_key": null,
                    "encryption_key_size": null,
                    "encryption_luks_version": null,
                    "encryption_password": null,
                    "fs_create_options": "",
                    "fs_label": "",
                    "fs_overwrite_existing": true,
                    "fs_type": "xfs",
                    "mount_check": 0,
                    "mount_device_identifier": "uuid",
                    "mount_options": "defaults",
                    "mount_passno": 0,
                    "mount_point": "",
                    "name": "test1",
                    "raid_chunk_size": null,
                    "raid_device_count": null,
                    "raid_disks": [],
                    "raid_level": null,
                    "raid_metadata_version": null,
                    "raid_spare_count": null,
                    "size": "3g",
                    "state": "present",
                    "thin": false,
                    "thin_pool_name": null,
                    "thin_pool_size": null,
                    "type": "lvm",
                    "vdo_pool_size": null
                }
            ]
        }
    ]
}

TASK [Print out volume information] ********************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-role-results.yml:6
Wednesday 06 July 2022  14:54:13 +0000 (0:00:00.053)       0:00:53.581 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {}

TASK [Collect info about the volumes.] *****************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-role-results.yml:14
Wednesday 06 July 2022  14:54:13 +0000 (0:00:00.036)       0:00:53.618 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "info": {
        "/dev/mapper/foo-test1": {
            "fstype": "xfs",
            "label": "",
            "name": "/dev/mapper/foo-test1",
            "size": "3G",
            "type": "lvm",
            "uuid": "1c41a782-c302-4977-99a1-bf5ce9244c3a"
        },
        "/dev/nvme0n1": {
            "fstype": "",
            "label": "",
            "name": "/dev/nvme0n1",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/nvme1n1": {
            "fstype": "",
            "label": "",
            "name": "/dev/nvme1n1",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/nvme2n1": {
            "fstype": "",
            "label": "",
            "name": "/dev/nvme2n1",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/sda": {
            "fstype": "LVM2_member",
            "label": "",
            "name": "/dev/sda",
            "size": "10G",
            "type": "disk",
            "uuid": "t9n4nJ-EghS-hyi4-sfCo-76en-eXSi-MMptMD"
        },
        "/dev/sdb": {
            "fstype": "",
            "label": "",
            "name": "/dev/sdb",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/sdc": {
            "fstype": "",
            "label": "",
            "name": "/dev/sdc",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/sr0": {
            "fstype": "iso9660",
            "label": "cidata",
            "name": "/dev/sr0",
            "size": "364K",
            "type": "rom",
            "uuid": "2022-07-06-14-53-07-00"
        },
        "/dev/vda": {
            "fstype": "",
            "label": "",
            "name": "/dev/vda",
            "size": "4G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vda1": {
            "fstype": "",
            "label": "",
            "name": "/dev/vda1",
            "size": "1M",
            "type": "partition",
            "uuid": ""
        },
        "/dev/vda2": {
            "fstype": "ext4",
            "label": "boot",
            "name": "/dev/vda2",
            "size": "500M",
            "type": "partition",
            "uuid": "5f2f82d0-ae0a-4574-8811-62a31a51a870"
        },
        "/dev/vda3": {
            "fstype": "vfat",
            "label": "",
            "name": "/dev/vda3",
            "size": "100M",
            "type": "partition",
            "uuid": "5B84-6DD7"
        },
        "/dev/vda4": {
            "fstype": "",
            "label": "",
            "name": "/dev/vda4",
            "size": "4M",
            "type": "partition",
            "uuid": ""
        },
        "/dev/vda5": {
            "fstype": "btrfs",
            "label": "fedora",
            "name": "/dev/vda5",
            "size": "3.4G",
            "type": "partition",
            "uuid": "fbdaf05f-1a41-4dc5-b56e-a10edb430f9a"
        },
        "/dev/vdb": {
            "fstype": "",
            "label": "",
            "name": "/dev/vdb",
            "size": "2G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vdb1": {
            "fstype": "ext4",
            "label": "yumcache",
            "name": "/dev/vdb1",
            "size": "2G",
            "type": "partition",
            "uuid": "e676dfc5-3e4b-4331-8ede-73c3f56d2cab"
        },
        "/dev/vdc": {
            "fstype": "",
            "label": "",
            "name": "/dev/vdc",
            "size": "2G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vdc1": {
            "fstype": "ext4",
            "label": "yumvarlib",
            "name": "/dev/vdc1",
            "size": "2G",
            "type": "partition",
            "uuid": "0c299eb4-81f5-4414-b246-b95738eb82f0"
        },
        "/dev/vdd": {
            "fstype": "",
            "label": "",
            "name": "/dev/vdd",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vde": {
            "fstype": "",
            "label": "",
            "name": "/dev/vde",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vdf": {
            "fstype": "",
            "label": "",
            "name": "/dev/vdf",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/zram0": {
            "fstype": "",
            "label": "",
            "name": "/dev/zram0",
            "size": "1.9G",
            "type": "disk",
            "uuid": ""
        }
    }
}

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-role-results.yml:19
Wednesday 06 July 2022  14:54:14 +0000 (0:00:00.413)       0:00:54.031 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "cmd": [
        "cat",
        "/etc/fstab"
    ],
    "delta": "0:00:00.003103",
    "end": "2022-07-06 14:54:13.345744",
    "rc": 0,
    "start": "2022-07-06 14:54:13.342641"
}

STDOUT:


#
# /etc/fstab
# Created by anaconda on Tue Jul  5 07:18:20 2022
#
# Accessible filesystems, by reference, are maintained under '/dev/disk/'.
# See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info.
#
# After editing this file, run 'systemctl daemon-reload' to update systemd
# units generated from this file.
#
UUID=fbdaf05f-1a41-4dc5-b56e-a10edb430f9a /                       btrfs   subvol=root,compress=zstd:1 0 0
UUID=5f2f82d0-ae0a-4574-8811-62a31a51a870 /boot                   ext4    defaults        1 2
UUID=5B84-6DD7          /boot/efi               vfat    defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
UUID=fbdaf05f-1a41-4dc5-b56e-a10edb430f9a /home                   btrfs   subvol=home,compress=zstd:1 0 0
/dev/vdb1	/var/cache/dnf	auto	defaults,nofail,comment=cloudconfig	0	2
/dev/vdc1	/var/lib/dnf	auto	defaults,nofail,comment=cloudconfig	0	2
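
The task above simply cats /etc/fstab so that later verification tasks can search the output for the volume's mount entry. A minimal sketch of an equivalent task, assuming the result is registered as storage_test_fstab (the variable unset during cleanup later in this run); the role test's exact source is not shown in this log:

    - name: Read the /etc/fstab file for volume existence
      command: cat /etc/fstab
      register: storage_test_fstab   # assumed register name
      changed_when: false            # matches the "changed": false reported above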

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-role-results.yml:24
Wednesday 06 July 2022  14:54:14 +0000 (0:00:00.394)       0:00:54.426 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "cmd": [
        "cat",
        "/etc/crypttab"
    ],
    "delta": "0:00:00.002998",
    "end": "2022-07-06 14:54:13.745587",
    "failed_when_result": false,
    "rc": 0,
    "start": "2022-07-06 14:54:13.742589"
}

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmpus9dv81c/tests/storage/verify-role-results.yml:33
Wednesday 06 July 2022  14:54:15 +0000 (0:00:00.398)       0:00:54.825 ******** 
[WARNING]: The loop variable 'storage_test_pool' is already in use. You should
set the `loop_var` value in the `loop_control` option for the task to something
else to avoid variable collisions and unexpected behavior.
included: /tmp/tmpus9dv81c/tests/storage/test-verify-pool.yml for /cache/fedora-35.qcow2.snap => (item={'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '', 'name': 'test1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/dm-0'}], 'raid_chunk_size': None})
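
The [WARNING] above is benign but worth noting: the include loop's loop variable 'storage_test_pool' collides with a variable that already exists, so Ansible recommends picking a dedicated loop_var. A hedged sketch of the fix it suggests; the loop source variable and the replacement name below are illustrative, not the role test's actual code:

    - name: Verify the volumes listed in storage_pools were correctly managed
      include_tasks: test-verify-pool.yml
      loop: "{{ _storage_pools_list }}"        # assumed name for the list of pools under test
      loop_control:
        loop_var: storage_test_pool_item       # any name that does not collide with 'storage_test_pool'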

TASK [Set _storage_pool_tests] *************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool.yml:5
Wednesday 06 July 2022  14:54:15 +0000 (0:00:00.061)       0:00:54.886 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_pool_tests": [
            "members",
            "volumes"
        ]
    },
    "changed": false
}

TASK [include_tasks] ***********************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool.yml:18
Wednesday 06 July 2022  14:54:15 +0000 (0:00:00.073)       0:00:54.959 ******** 
included: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-members.yml for /cache/fedora-35.qcow2.snap => (item=members)
included: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-volumes.yml for /cache/fedora-35.qcow2.snap => (item=volumes)

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-members.yml:1
Wednesday 06 July 2022  14:54:15 +0000 (0:00:00.049)       0:00:55.008 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_expected_pv_count": "1",
        "_storage_test_pool_pvs_lvm": [
            "/dev/sda"
        ]
    },
    "changed": false
}

TASK [Get the canonical device path for each member device] ********************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-members.yml:6
Wednesday 06 July 2022  14:54:15 +0000 (0:00:00.119)       0:00:55.128 ******** 
ok: [/cache/fedora-35.qcow2.snap] => (item=/dev/sda) => {
    "ansible_loop_var": "pv",
    "changed": false,
    "device": "/dev/sda",
    "pv": "/dev/sda"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-members.yml:15
Wednesday 06 July 2022  14:54:15 +0000 (0:00:00.406)       0:00:55.535 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "__pvs_lvm_len": "1"
    },
    "changed": false
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-members.yml:19
Wednesday 06 July 2022  14:54:15 +0000 (0:00:00.058)       0:00:55.593 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_pool_pvs": [
            "/dev/sda"
        ]
    },
    "changed": false
}

TASK [Verify PV count] *********************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-members.yml:23
Wednesday 06 July 2022  14:54:15 +0000 (0:00:00.057)       0:00:55.651 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed
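
This assertion compares the counted PVs against the expected count set a few tasks earlier (__pvs_lvm_len and _storage_test_expected_pv_count, both "1" here). A minimal sketch of such a check; the message text is illustrative:

    - name: Verify PV count
      assert:
        that:
          - __pvs_lvm_len | int == _storage_test_expected_pv_count | int
        msg: "Unexpected number of LVM PVs in pool 'foo'"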

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-members.yml:29
Wednesday 06 July 2022  14:54:15 +0000 (0:00:00.057)       0:00:55.708 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_expected_pv_type": "disk"
    },
    "changed": false
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-members.yml:33
Wednesday 06 July 2022  14:54:15 +0000 (0:00:00.042)       0:00:55.750 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_expected_pv_type": "disk"
    },
    "changed": false
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-members.yml:37
Wednesday 06 July 2022  14:54:16 +0000 (0:00:00.050)       0:00:55.801 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check the type of each PV] ***********************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-members.yml:41
Wednesday 06 July 2022  14:54:16 +0000 (0:00:00.024)       0:00:55.826 ******** 
ok: [/cache/fedora-35.qcow2.snap] => (item=/dev/sda) => {
    "ansible_loop_var": "pv",
    "changed": false,
    "pv": "/dev/sda"
}

MSG:

All assertions passed

TASK [Check MD RAID] ***********************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-members.yml:50
Wednesday 06 July 2022  14:54:16 +0000 (0:00:00.046)       0:00:55.872 ******** 
included: /tmp/tmpus9dv81c/tests/storage/verify-pool-md.yml for /cache/fedora-35.qcow2.snap

TASK [get information about RAID] **********************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-md.yml:6
Wednesday 06 July 2022  14:54:16 +0000 (0:00:00.045)       0:00:55.918 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-md.yml:12
Wednesday 06 July 2022  14:54:16 +0000 (0:00:00.025)       0:00:55.943 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-md.yml:16
Wednesday 06 July 2022  14:54:16 +0000 (0:00:00.025)       0:00:55.969 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-md.yml:20
Wednesday 06 July 2022  14:54:16 +0000 (0:00:00.026)       0:00:55.995 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-md.yml:24
Wednesday 06 July 2022  14:54:16 +0000 (0:00:00.027)       0:00:56.022 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-md.yml:30
Wednesday 06 July 2022  14:54:16 +0000 (0:00:00.027)       0:00:56.050 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-md.yml:36
Wednesday 06 July 2022  14:54:16 +0000 (0:00:00.026)       0:00:56.076 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-md.yml:44
Wednesday 06 July 2022  14:54:16 +0000 (0:00:00.027)       0:00:56.104 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_md_active_devices_re": null,
        "storage_test_md_metadata_version_re": null,
        "storage_test_md_spare_devices_re": null
    },
    "changed": false
}

TASK [Check LVM RAID] **********************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-members.yml:53
Wednesday 06 July 2022  14:54:16 +0000 (0:00:00.036)       0:00:56.141 ******** 
included: /tmp/tmpus9dv81c/tests/storage/verify-pool-members-lvmraid.yml for /cache/fedora-35.qcow2.snap

TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-members-lvmraid.yml:1
Wednesday 06 July 2022  14:54:16 +0000 (0:00:00.047)       0:00:56.188 ******** 
included: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-lvmraid.yml for /cache/fedora-35.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '', 'name': 'test1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/dm-0'})

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-lvmraid.yml:3
Wednesday 06 July 2022  14:54:16 +0000 (0:00:00.045)       0:00:56.233 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-lvmraid.yml:8
Wednesday 06 July 2022  14:54:16 +0000 (0:00:00.031)       0:00:56.264 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-lvmraid.yml:12
Wednesday 06 July 2022  14:54:16 +0000 (0:00:00.030)       0:00:56.294 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check Thin Pools] ********************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-members.yml:56
Wednesday 06 July 2022  14:54:16 +0000 (0:00:00.030)       0:00:56.325 ******** 
included: /tmp/tmpus9dv81c/tests/storage/verify-pool-members-thin.yml for /cache/fedora-35.qcow2.snap

TASK [Validate pool member thinpool settings] **********************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-members-thin.yml:1
Wednesday 06 July 2022  14:54:16 +0000 (0:00:00.046)       0:00:56.372 ******** 
included: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-thin.yml for /cache/fedora-35.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '', 'name': 'test1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/dm-0'})

TASK [Get information about thinpool] ******************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-thin.yml:3
Wednesday 06 July 2022  14:54:16 +0000 (0:00:00.045)       0:00:56.417 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check that volume is in correct thinpool (when thinp name is provided)] ***
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-thin.yml:8
Wednesday 06 July 2022  14:54:16 +0000 (0:00:00.023)       0:00:56.441 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check that volume is in thinpool (when thinp name is not provided)] ******
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-thin.yml:13
Wednesday 06 July 2022  14:54:16 +0000 (0:00:00.023)       0:00:56.464 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-thin.yml:17
Wednesday 06 July 2022  14:54:16 +0000 (0:00:00.023)       0:00:56.488 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check member encryption] *************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-members.yml:59
Wednesday 06 July 2022  14:54:16 +0000 (0:00:00.023)       0:00:56.511 ******** 
included: /tmp/tmpus9dv81c/tests/storage/verify-pool-members-encryption.yml for /cache/fedora-35.qcow2.snap

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-members-encryption.yml:4
Wednesday 06 July 2022  14:54:16 +0000 (0:00:00.046)       0:00:56.558 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_expected_crypttab_entries": "0",
        "_storage_test_expected_crypttab_key_file": "-"
    },
    "changed": false
}

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-members-encryption.yml:8
Wednesday 06 July 2022  14:54:16 +0000 (0:00:00.104)       0:00:56.662 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => (item=/dev/sda)  => {
    "_storage_test_pool_member_path": "/dev/sda",
    "ansible_loop_var": "_storage_test_pool_member_path",
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-members-encryption.yml:15
Wednesday 06 July 2022  14:54:16 +0000 (0:00:00.029)       0:00:56.691 ******** 
included: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-crypttab.yml for /cache/fedora-35.qcow2.snap => (item=/dev/sda)

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-crypttab.yml:1
Wednesday 06 July 2022  14:54:16 +0000 (0:00:00.042)       0:00:56.734 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": []
    },
    "changed": false
}

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-crypttab.yml:4
Wednesday 06 July 2022  14:54:16 +0000 (0:00:00.047)       0:00:56.781 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-crypttab.yml:9
Wednesday 06 July 2022  14:54:17 +0000 (0:00:00.049)       0:00:56.831 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-crypttab.yml:15
Wednesday 06 July 2022  14:54:17 +0000 (0:00:00.037)       0:00:56.869 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-crypttab.yml:21
Wednesday 06 July 2022  14:54:17 +0000 (0:00:00.041)       0:00:56.910 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-crypttab.yml:27
Wednesday 06 July 2022  14:54:17 +0000 (0:00:00.041)       0:00:56.952 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": null
    },
    "changed": false
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-members-encryption.yml:22
Wednesday 06 July 2022  14:54:17 +0000 (0:00:00.038)       0:00:56.991 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": null,
        "_storage_test_crypttab_key_file": null
    },
    "changed": false
}

TASK [Check VDO] ***************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-members.yml:62
Wednesday 06 July 2022  14:54:17 +0000 (0:00:00.035)       0:00:57.027 ******** 
included: /tmp/tmpus9dv81c/tests/storage/verify-pool-members-vdo.yml for /cache/fedora-35.qcow2.snap

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-members-vdo.yml:1
Wednesday 06 July 2022  14:54:17 +0000 (0:00:00.049)       0:00:57.076 ******** 
included: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-vdo.yml for /cache/fedora-35.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '', 'name': 'test1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/dm-0'})

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-vdo.yml:3
Wednesday 06 July 2022  14:54:17 +0000 (0:00:00.045)       0:00:57.122 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-vdo.yml:8
Wednesday 06 July 2022  14:54:17 +0000 (0:00:00.024)       0:00:57.147 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [assert] ******************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-vdo.yml:11
Wednesday 06 July 2022  14:54:17 +0000 (0:00:00.026)       0:00:57.173 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [assert] ******************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-vdo.yml:16
Wednesday 06 July 2022  14:54:17 +0000 (0:00:00.026)       0:00:57.199 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-vdo.yml:21
Wednesday 06 July 2022  14:54:17 +0000 (0:00:00.036)       0:00:57.236 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [assert] ******************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-vdo.yml:24
Wednesday 06 July 2022  14:54:17 +0000 (0:00:00.031)       0:00:57.267 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [assert] ******************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-vdo.yml:29
Wednesday 06 July 2022  14:54:17 +0000 (0:00:00.027)       0:00:57.294 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-vdo.yml:39
Wednesday 06 July 2022  14:54:17 +0000 (0:00:00.024)       0:00:57.319 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_vdo_status": null
    },
    "changed": false
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-members.yml:65
Wednesday 06 July 2022  14:54:17 +0000 (0:00:00.036)       0:00:57.356 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "__pvs_lvm_len": null,
        "_storage_test_expected_pv_count": null,
        "_storage_test_expected_pv_type": null,
        "_storage_test_pool_pvs": [],
        "_storage_test_pool_pvs_lvm": []
    },
    "changed": false
}

TASK [verify the volumes] ******************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-volumes.yml:3
Wednesday 06 July 2022  14:54:17 +0000 (0:00:00.036)       0:00:57.392 ******** 
[WARNING]: The loop variable 'storage_test_volume' is already in use. You
should set the `loop_var` value in the `loop_control` option for the task to
something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmpus9dv81c/tests/storage/test-verify-volume.yml for /cache/fedora-35.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '', 'name': 'test1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/dm-0'})

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume.yml:2
Wednesday 06 July 2022  14:54:17 +0000 (0:00:00.046)       0:00:57.439 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_volume_present": true,
        "_storage_volume_tests": [
            "mount",
            "fstab",
            "fs",
            "device",
            "encryption",
            "md",
            "size",
            "cache"
        ]
    },
    "changed": false
}

TASK [include_tasks] ***********************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume.yml:10
Wednesday 06 July 2022  14:54:17 +0000 (0:00:00.052)       0:00:57.491 ******** 
included: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-mount.yml for /cache/fedora-35.qcow2.snap => (item=mount)
included: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-fstab.yml for /cache/fedora-35.qcow2.snap => (item=fstab)
included: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-fs.yml for /cache/fedora-35.qcow2.snap => (item=fs)
included: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-device.yml for /cache/fedora-35.qcow2.snap => (item=device)
included: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml for /cache/fedora-35.qcow2.snap => (item=encryption)
included: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-md.yml for /cache/fedora-35.qcow2.snap => (item=md)
included: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-size.yml for /cache/fedora-35.qcow2.snap => (item=size)
included: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-cache.yml for /cache/fedora-35.qcow2.snap => (item=cache)
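
The eight includes above follow an obvious naming pattern: one test-verify-volume-<subset>.yml file per entry in _storage_volume_tests. A sketch of the include loop that would produce this, assuming the loop variable name shown in the comment:

    - name: include_tasks
      include_tasks: "test-verify-volume-{{ storage_test_volume_subset }}.yml"
      loop: "{{ _storage_volume_tests }}"
      loop_control:
        loop_var: storage_test_volume_subset   # assumed loop variable name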

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-mount.yml:6
Wednesday 06 July 2022  14:54:17 +0000 (0:00:00.078)       0:00:57.570 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_device_path": "/dev/mapper/foo-test1"
    },
    "changed": false
}

TASK [Set some facts] **********************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-mount.yml:10
Wednesday 06 July 2022  14:54:17 +0000 (0:00:00.042)       0:00:57.612 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_mount_device_matches": [],
        "storage_test_mount_expected_match_count": "0",
        "storage_test_mount_point_matches": [],
        "storage_test_swap_expected_matches": "0"
    },
    "changed": false
}
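
Because this volume has an empty mount_point, both the expected match count and the device match list come out zero/empty. One plausible way these facts could be derived from the gathered mount facts; the expressions are illustrative, not the role test's actual code:

    - name: Set some facts (sketch)
      set_fact:
        storage_test_mount_device_matches: "{{ ansible_mounts | selectattr('device', 'equalto', storage_test_device_path) | list }}"
        storage_test_mount_expected_match_count: "{{ 1 if (_storage_test_volume_present and storage_test_volume.mount_point) else 0 }}"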

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-mount.yml:20
Wednesday 06 July 2022  14:54:17 +0000 (0:00:00.055)       0:00:57.668 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-mount.yml:29
Wednesday 06 July 2022  14:54:17 +0000 (0:00:00.024)       0:00:57.692 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-mount.yml:37
Wednesday 06 July 2022  14:54:17 +0000 (0:00:00.046)       0:00:57.739 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [command] *****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-mount.yml:46
Wednesday 06 July 2022  14:54:17 +0000 (0:00:00.035)       0:00:57.774 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Gather swap info] ********************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-mount.yml:50
Wednesday 06 July 2022  14:54:18 +0000 (0:00:00.023)       0:00:57.797 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify swap status] ******************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-mount.yml:55
Wednesday 06 July 2022  14:54:18 +0000 (0:00:00.024)       0:00:57.822 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Unset facts] *************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-mount.yml:65
Wednesday 06 July 2022  14:54:18 +0000 (0:00:00.024)       0:00:57.847 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_mount_device_matches": null,
        "storage_test_mount_expected_match_count": null,
        "storage_test_mount_point_matches": null,
        "storage_test_swap_expected_matches": null,
        "storage_test_swaps": null,
        "storage_test_sys_node": null
    },
    "changed": false
}

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-fstab.yml:2
Wednesday 06 July 2022  14:54:18 +0000 (0:00:00.083)       0:00:57.931 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_fstab_expected_id_matches": "0",
        "storage_test_fstab_expected_mount_options_matches": "1",
        "storage_test_fstab_expected_mount_point_matches": "0",
        "storage_test_fstab_id_matches": [],
        "storage_test_fstab_mount_options_matches": [],
        "storage_test_fstab_mount_point_matches": []
    },
    "changed": false
}
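
With the mount removed, the volume's _mount_id (/dev/mapper/foo-test1) should no longer appear in /etc/fstab, hence the expected id match count of "0" and the empty match lists. One plausible way the id-match bookkeeping could be computed from the earlier cat /etc/fstab output; the register name and expressions are assumptions:

    - name: Set some variables for fstab checking (sketch)
      set_fact:
        storage_test_fstab_id_matches: "{{ storage_test_fstab.stdout_lines | select('search', storage_test_volume._mount_id) | list }}"
        storage_test_fstab_expected_id_matches: "{{ 1 if (_storage_test_volume_present and storage_test_volume.mount_point) else 0 }}"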

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-fstab.yml:12
Wednesday 06 July 2022  14:54:18 +0000 (0:00:00.146)       0:00:58.077 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-fstab.yml:19
Wednesday 06 July 2022  14:54:18 +0000 (0:00:00.052)       0:00:58.129 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-fstab.yml:25
Wednesday 06 July 2022  14:54:18 +0000 (0:00:00.051)       0:00:58.181 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Clean up variables] ******************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-fstab.yml:34
Wednesday 06 July 2022  14:54:18 +0000 (0:00:00.037)       0:00:58.218 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_fstab_expected_id_matches": null,
        "storage_test_fstab_expected_mount_options_matches": null,
        "storage_test_fstab_expected_mount_point_matches": null,
        "storage_test_fstab_id_matches": null,
        "storage_test_fstab_mount_options_matches": null,
        "storage_test_fstab_mount_point_matches": null
    },
    "changed": false
}

TASK [Verify fs type] **********************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-fs.yml:4
Wednesday 06 July 2022  14:54:18 +0000 (0:00:00.031)       0:00:58.250 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-fs.yml:10
Wednesday 06 July 2022  14:54:18 +0000 (0:00:00.036)       0:00:58.287 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-device.yml:4
Wednesday 06 July 2022  14:54:18 +0000 (0:00:00.038)       0:00:58.325 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "stat": {
        "atime": 1657119217.3117163,
        "attr_flags": "",
        "attributes": [],
        "block_size": 4096,
        "blocks": 0,
        "charset": "binary",
        "ctime": 1657119215.3157163,
        "dev": 5,
        "device_type": 64768,
        "executable": false,
        "exists": true,
        "gid": 6,
        "gr_name": "disk",
        "inode": 505,
        "isblk": true,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": false,
        "issock": false,
        "isuid": false,
        "mimetype": "inode/symlink",
        "mode": "0660",
        "mtime": 1657119215.3157163,
        "nlink": 1,
        "path": "/dev/mapper/foo-test1",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": false,
        "rusr": true,
        "size": 0,
        "uid": 0,
        "version": null,
        "wgrp": true,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": false,
        "xoth": false,
        "xusr": false
    }
}
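
The stat result above is for the device-mapper node of the LV; isblk is true because the symlink under /dev/mapper is resolved to the underlying block device. A minimal sketch of such a check, with an assumed register name:

    - name: See whether the device node is present (sketch)
      stat:
        path: "{{ storage_test_volume._device }}"
        follow: true                  # resolve the /dev/mapper symlink to the block device
      register: storage_test_dev      # assumed register name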

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-device.yml:10
Wednesday 06 July 2022  14:54:18 +0000 (0:00:00.401)       0:00:58.727 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-device.yml:15
Wednesday 06 July 2022  14:54:18 +0000 (0:00:00.039)       0:00:58.767 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-device.yml:21
Wednesday 06 July 2022  14:54:19 +0000 (0:00:00.039)       0:00:58.807 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "st_volume_type": "lvm"
    },
    "changed": false
}

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-device.yml:25
Wednesday 06 July 2022  14:54:19 +0000 (0:00:00.038)       0:00:58.846 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-device.yml:30
Wednesday 06 July 2022  14:54:19 +0000 (0:00:00.026)       0:00:58.872 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:3
Wednesday 06 July 2022  14:54:19 +0000 (0:00:00.037)       0:00:58.910 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Ensure cryptsetup is present] ********************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:10
Wednesday 06 July 2022  14:54:19 +0000 (0:00:00.023)       0:00:58.933 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:15
Wednesday 06 July 2022  14:54:21 +0000 (0:00:01.985)       0:01:00.919 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:21
Wednesday 06 July 2022  14:54:21 +0000 (0:00:00.031)       0:01:00.950 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:27
Wednesday 06 July 2022  14:54:21 +0000 (0:00:00.031)       0:01:00.982 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:33
Wednesday 06 July 2022  14:54:21 +0000 (0:00:00.055)       0:01:01.038 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:39
Wednesday 06 July 2022  14:54:21 +0000 (0:00:00.027)       0:01:01.065 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:44
Wednesday 06 July 2022  14:54:21 +0000 (0:00:00.029)       0:01:01.094 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:50
Wednesday 06 July 2022  14:54:21 +0000 (0:00:00.030)       0:01:01.125 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:56
Wednesday 06 July 2022  14:54:21 +0000 (0:00:00.026)       0:01:01.151 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:62
Wednesday 06 July 2022  14:54:21 +0000 (0:00:00.029)       0:01:01.180 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": [],
        "_storage_test_expected_crypttab_entries": "0",
        "_storage_test_expected_crypttab_key_file": "-"
    },
    "changed": false
}
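
Since the volume is not encrypted, no crypttab entry is expected and no key file is configured. One plausible way this bookkeeping could be derived from the earlier cat /etc/crypttab output and the volume definition; the register name and expressions are assumptions:

    - name: set_fact (sketch of the crypttab bookkeeping)
      set_fact:
        _storage_test_crypttab_entries: "{{ storage_test_crypttab.stdout_lines | select('search', storage_test_volume._raw_device) | list }}"
        _storage_test_expected_crypttab_entries: "{{ 1 if storage_test_volume.encryption else 0 }}"
        _storage_test_expected_crypttab_key_file: "{{ storage_test_volume.encryption_key or '-' }}"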

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:67
Wednesday 06 July 2022  14:54:21 +0000 (0:00:00.053)       0:01:01.234 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:72
Wednesday 06 July 2022  14:54:21 +0000 (0:00:00.052)       0:01:01.287 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:78
Wednesday 06 July 2022  14:54:21 +0000 (0:00:00.038)       0:01:01.325 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:84
Wednesday 06 July 2022  14:54:21 +0000 (0:00:00.051)       0:01:01.377 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:90
Wednesday 06 July 2022  14:54:21 +0000 (0:00:00.043)       0:01:01.421 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": null,
        "_storage_test_expected_crypttab_entries": null,
        "_storage_test_expected_crypttab_key_file": null
    },
    "changed": false
}

TASK [get information about RAID] **********************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-md.yml:7
Wednesday 06 July 2022  14:54:21 +0000 (0:00:00.038)       0:01:01.459 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-md.yml:13
Wednesday 06 July 2022  14:54:21 +0000 (0:00:00.037)       0:01:01.497 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-md.yml:17
Wednesday 06 July 2022  14:54:21 +0000 (0:00:00.035)       0:01:01.533 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-md.yml:21
Wednesday 06 July 2022  14:54:21 +0000 (0:00:00.039)       0:01:01.572 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-md.yml:25
Wednesday 06 July 2022  14:54:21 +0000 (0:00:00.042)       0:01:01.615 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-md.yml:31
Wednesday 06 July 2022  14:54:21 +0000 (0:00:00.038)       0:01:01.653 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-md.yml:37
Wednesday 06 July 2022  14:54:21 +0000 (0:00:00.039)       0:01:01.693 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-size.yml:3
Wednesday 06 July 2022  14:54:21 +0000 (0:00:00.038)       0:01:01.731 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "bytes": 3221225472,
    "changed": false,
    "lvm": "3g",
    "parted": "3GiB",
    "size": "3 GiB"
}

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-size.yml:9
Wednesday 06 July 2022  14:54:22 +0000 (0:00:00.411)       0:01:02.143 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "bytes": 3221225472,
    "changed": false,
    "lvm": "3g",
    "parted": "3GiB",
    "size": "3 GiB"
}

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-size.yml:15
Wednesday 06 July 2022  14:54:22 +0000 (0:00:00.451)       0:01:02.594 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_expected_size": "3221225472"
    },
    "changed": false
}
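
The expected size recorded here is simply the requested "3g" converted to bytes (3 GiB = 3221225472). An illustrative equivalent using Ansible's human_to_bytes filter; the role test may use its own helper for this conversion:

    - name: Establish base value for expected size (sketch)
      set_fact:
        storage_test_expected_size: "{{ '3g' | human_to_bytes }}"   # 3 GiB -> 3221225472 bytes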

TASK [debug] *******************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-size.yml:20
Wednesday 06 July 2022  14:54:22 +0000 (0:00:00.086)       0:01:02.680 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "storage_test_expected_size": "3221225472"
}

TASK [debug] *******************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-size.yml:25
Wednesday 06 July 2022  14:54:22 +0000 (0:00:00.035)       0:01:02.715 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-size.yml:28
Wednesday 06 July 2022  14:54:22 +0000 (0:00:00.037)       0:01:02.753 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-size.yml:31
Wednesday 06 July 2022  14:54:23 +0000 (0:00:00.039)       0:01:02.792 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [debug] *******************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-size.yml:36
Wednesday 06 July 2022  14:54:23 +0000 (0:00:00.037)       0:01:02.830 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-size.yml:39
Wednesday 06 July 2022  14:54:23 +0000 (0:00:00.042)       0:01:02.872 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [debug] *******************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-size.yml:44
Wednesday 06 July 2022  14:54:23 +0000 (0:00:00.039)       0:01:02.911 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "storage_test_actual_size": {
        "bytes": 3221225472,
        "changed": false,
        "failed": false,
        "lvm": "3g",
        "parted": "3GiB",
        "size": "3 GiB"
    }
}

TASK [debug] *******************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-size.yml:47
Wednesday 06 July 2022  14:54:23 +0000 (0:00:00.039)       0:01:02.951 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "storage_test_expected_size": "3221225472"
}

TASK [assert] ******************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-size.yml:50
Wednesday 06 July 2022  14:54:23 +0000 (0:00:00.034)       0:01:02.985 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-cache.yml:6
Wednesday 06 July 2022  14:54:23 +0000 (0:00:00.051)       0:01:03.036 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "cmd": [
        "lvs",
        "--noheadings",
        "--nameprefixes",
        "--units=b",
        "--nosuffix",
        "--unquoted",
        "-o",
        "name,attr,cache_total_blocks,chunk_size,segtype",
        "foo/test1"
    ],
    "delta": "0:00:00.041649",
    "end": "2022-07-06 14:54:22.401487",
    "rc": 0,
    "start": "2022-07-06 14:54:22.359838"
}

STDOUT:

  LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-a----- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-cache.yml:14
Wednesday 06 July 2022  14:54:23 +0000 (0:00:00.445)       0:01:03.481 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_lv_segtype": [
            "linear"
        ]
    },
    "changed": false
}
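
The segment type fact is distilled from the LVM2_SEGTYPE= field of the lvs output above ("linear", i.e. the LV is not cached). An illustrative way to do that extraction, assuming the lvs result had been registered as lvs_info:

    - name: Extract the LV segment type from the lvs output (sketch)
      set_fact:
        storage_test_lv_segtype: >-
          {{ lvs_info.stdout_lines | map('regex_search', 'LVM2_SEGTYPE=(\S+)', '\1') | flatten }}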

TASK [check segment type] ******************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-cache.yml:17
Wednesday 06 July 2022  14:54:23 +0000 (0:00:00.052)       0:01:03.534 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-cache.yml:22
Wednesday 06 July 2022  14:54:23 +0000 (0:00:00.053)       0:01:03.588 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-cache.yml:26
Wednesday 06 July 2022  14:54:23 +0000 (0:00:00.042)       0:01:03.631 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-cache.yml:32
Wednesday 06 July 2022  14:54:23 +0000 (0:00:00.040)       0:01:03.671 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check cache size] ********************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-cache.yml:36
Wednesday 06 July 2022  14:54:23 +0000 (0:00:00.037)       0:01:03.708 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Clean up facts] **********************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume.yml:16
Wednesday 06 July 2022  14:54:23 +0000 (0:00:00.037)       0:01:03.746 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_volume_present": null
    },
    "changed": false
}

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-role-results.yml:40
Wednesday 06 July 2022  14:54:24 +0000 (0:00:00.039)       0:01:03.785 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_pool": null
    },
    "changed": false
}

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmpus9dv81c/tests/storage/verify-role-results.yml:47
Wednesday 06 July 2022  14:54:24 +0000 (0:00:00.035)       0:01:03.820 ******** 

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-role-results.yml:57
Wednesday 06 July 2022  14:54:24 +0000 (0:00:00.021)       0:01:03.842 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_blkinfo": null,
        "storage_test_crypttab": null,
        "storage_test_fstab": null,
        "storage_test_volume": null
    },
    "changed": false
}

TASK [Clean up] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/tests_remove_mount.yml:69
Wednesday 06 July 2022  14:54:24 +0000 (0:00:00.035)       0:01:03.878 ******** 

TASK [fedora.linux_system_roles.storage : set platform/version specific variables] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Wednesday 06 July 2022  14:54:24 +0000 (0:00:00.060)       0:01:03.938 ******** 
included: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for /cache/fedora-35.qcow2.snap

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Wednesday 06 July 2022  14:54:24 +0000 (0:00:00.037)       0:01:03.975 ******** 
ok: [/cache/fedora-35.qcow2.snap]

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Wednesday 06 July 2022  14:54:24 +0000 (0:00:00.545)       0:01:04.521 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => (item=RedHat.yml)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
ok: [/cache/fedora-35.qcow2.snap] => (item=Fedora.yml) => {
    "ansible_facts": {
        "_storage_copr_packages": [
            {
                "packages": [
                    "vdo",
                    "kmod-vdo"
                ],
                "repository": "rhawalsh/dm-vdo"
            }
        ],
        "_storage_copr_support_packages": [
            "dnf-plugins-core"
        ],
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/vars/Fedora.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "Fedora.yml"
}
skipping: [/cache/fedora-35.qcow2.snap] => (item=Fedora_35.yml)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "Fedora_35.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [/cache/fedora-35.qcow2.snap] => (item=Fedora_35.yml)  => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "Fedora_35.yml",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : define an empty list of pools to be used in testing] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Wednesday 06 July 2022  14:54:24 +0000 (0:00:00.114)       0:01:04.635 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : define an empty list of volumes to be used in testing] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Wednesday 06 July 2022  14:54:24 +0000 (0:00:00.107)       0:01:04.743 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : include the appropriate provider tasks] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Wednesday 06 July 2022  14:54:24 +0000 (0:00:00.037)       0:01:04.780 ******** 
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
included: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for /cache/fedora-35.qcow2.snap

TASK [fedora.linux_system_roles.storage : get a list of rpm packages installed on host machine] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
Wednesday 06 July 2022  14:54:25 +0000 (0:00:00.059)       0:01:04.839 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : make sure blivet is available] *******
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:7
Wednesday 06 July 2022  14:54:25 +0000 (0:00:00.023)       0:01:04.863 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : show storage_pools] ******************
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:13
Wednesday 06 July 2022  14:54:25 +0000 (0:00:00.030)       0:01:04.894 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "storage_pools": [
        {
            "disks": [
                "sda"
            ],
            "name": "foo",
            "state": "absent",
            "volumes": [
                {
                    "mount_point": "",
                    "name": "test1",
                    "size": "3g"
                }
            ]
        }
    ]
}
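
This storage_pools value corresponds to the "Clean up" task at tests_remove_mount.yml:69. A plausible reconstruction of that invocation, built only from the values shown above (a sketch assuming the role is pulled in via include_role; the actual test file is not reproduced here):

    - name: Clean up
      include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_pools:
          - name: foo
            disks: ["sda"]
            state: absent
            volumes:
              - name: test1
                size: "3g"
                mount_point: ""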

TASK [fedora.linux_system_roles.storage : show storage_volumes] ****************
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:18
Wednesday 06 July 2022  14:54:25 +0000 (0:00:00.052)       0:01:04.946 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined"
}

TASK [fedora.linux_system_roles.storage : get required packages] ***************
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:23
Wednesday 06 July 2022  14:54:25 +0000 (0:00:00.042)       0:01:04.989 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : enable copr repositories if needed] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:35
Wednesday 06 July 2022  14:54:25 +0000 (0:00:00.029)       0:01:05.018 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : make sure required packages are installed] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:41
Wednesday 06 July 2022  14:54:25 +0000 (0:00:00.029)       0:01:05.047 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : get service facts] *******************
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:47
Wednesday 06 July 2022  14:54:25 +0000 (0:00:00.028)       0:01:05.076 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] *****
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:53
Wednesday 06 July 2022  14:54:25 +0000 (0:00:00.028)       0:01:05.104 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_cryptsetup_services": []
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58
Wednesday 06 July 2022  14:54:25 +0000 (0:00:00.065)       0:01:05.169 ******** 

TASK [fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64
Wednesday 06 July 2022  14:54:25 +0000 (0:00:00.022)       0:01:05.191 ******** 
changed: [/cache/fedora-35.qcow2.snap] => {
    "actions": [
        {
            "action": "destroy format",
            "device": "/dev/mapper/foo-test1",
            "fs_type": "xfs"
        },
        {
            "action": "destroy device",
            "device": "/dev/mapper/foo-test1",
            "fs_type": null
        },
        {
            "action": "destroy device",
            "device": "/dev/foo",
            "fs_type": null
        },
        {
            "action": "destroy format",
            "device": "/dev/sda",
            "fs_type": "lvmpv"
        }
    ],
    "changed": true,
    "crypts": [],
    "leaves": [
        "/dev/sr0",
        "/dev/vda1",
        "/dev/vda2",
        "/dev/vda3",
        "/dev/vda4",
        "/dev/vda5",
        "/dev/sda",
        "/dev/sdb",
        "/dev/sdc",
        "/dev/nvme0n1",
        "/dev/nvme1n1",
        "/dev/nvme2n1",
        "/dev/vdb1",
        "/dev/vdc1",
        "/dev/vdd",
        "/dev/vde",
        "/dev/vdf",
        "/dev/zram0"
    ],
    "mounts": [],
    "packages": [
        "dosfstools",
        "btrfs-progs",
        "e2fsprogs"
    ],
    "pools": [
        {
            "disks": [
                "sda"
            ],
            "encryption": false,
            "encryption_cipher": null,
            "encryption_key": null,
            "encryption_key_size": null,
            "encryption_luks_version": null,
            "encryption_password": null,
            "name": "foo",
            "raid_chunk_size": null,
            "raid_device_count": null,
            "raid_level": null,
            "raid_metadata_version": null,
            "raid_spare_count": null,
            "state": "absent",
            "type": "lvm",
            "volumes": [
                {
                    "_device": "/dev/mapper/foo-test1",
                    "_mount_id": "/dev/mapper/foo-test1",
                    "_raw_device": "/dev/mapper/foo-test1",
                    "cache_devices": [],
                    "cache_mode": null,
                    "cache_size": 0,
                    "cached": false,
                    "compression": null,
                    "deduplication": null,
                    "disks": [
                        "sda"
                    ],
                    "encryption": false,
                    "encryption_cipher": null,
                    "encryption_key": null,
                    "encryption_key_size": null,
                    "encryption_luks_version": null,
                    "encryption_password": null,
                    "fs_create_options": "",
                    "fs_label": "",
                    "fs_overwrite_existing": true,
                    "fs_type": "xfs",
                    "mount_check": 0,
                    "mount_device_identifier": "uuid",
                    "mount_options": "defaults",
                    "mount_passno": 0,
                    "mount_point": "",
                    "name": "test1",
                    "raid_chunk_size": null,
                    "raid_device_count": null,
                    "raid_disks": [],
                    "raid_level": null,
                    "raid_metadata_version": null,
                    "raid_spare_count": null,
                    "size": "3g",
                    "state": "present",
                    "thin": false,
                    "thin_pool_name": null,
                    "thin_pool_size": null,
                    "type": "lvm",
                    "vdo_pool_size": null
                }
            ]
        }
    ],
    "volumes": []
}
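
The four actions reported above tear the stack down from the top: wipe the xfs signature, remove the LV, remove the VG, then clear the LVM PV metadata on sda. Expressed as illustrative Ansible command tasks (an approximation only; blivet performs these steps through its own API rather than by shelling out):

    - name: Illustrative manual equivalent of the blivet actions (sketch)
      ansible.builtin.command: "{{ item }}"
      loop:
        - wipefs -a /dev/mapper/foo-test1    # destroy format (xfs)
        - lvremove -y foo/test1              # destroy device /dev/mapper/foo-test1
        - vgremove -y foo                    # destroy device /dev/foo
        - pvremove -y /dev/sda               # destroy format (lvmpv)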

TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:78
Wednesday 06 July 2022  14:54:27 +0000 (0:00:02.476)       0:01:07.668 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90
Wednesday 06 July 2022  14:54:27 +0000 (0:00:00.040)       0:01:07.708 ******** 

TASK [fedora.linux_system_roles.storage : show blivet_output] ******************
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:96
Wednesday 06 July 2022  14:54:27 +0000 (0:00:00.021)       0:01:07.730 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "blivet_output": {
        "actions": [
            {
                "action": "destroy format",
                "device": "/dev/mapper/foo-test1",
                "fs_type": "xfs"
            },
            {
                "action": "destroy device",
                "device": "/dev/mapper/foo-test1",
                "fs_type": null
            },
            {
                "action": "destroy device",
                "device": "/dev/foo",
                "fs_type": null
            },
            {
                "action": "destroy format",
                "device": "/dev/sda",
                "fs_type": "lvmpv"
            }
        ],
        "changed": true,
        "crypts": [],
        "failed": false,
        "leaves": [
            "/dev/sr0",
            "/dev/vda1",
            "/dev/vda2",
            "/dev/vda3",
            "/dev/vda4",
            "/dev/vda5",
            "/dev/sda",
            "/dev/sdb",
            "/dev/sdc",
            "/dev/nvme0n1",
            "/dev/nvme1n1",
            "/dev/nvme2n1",
            "/dev/vdb1",
            "/dev/vdc1",
            "/dev/vdd",
            "/dev/vde",
            "/dev/vdf",
            "/dev/zram0"
        ],
        "mounts": [],
        "packages": [
            "dosfstools",
            "btrfs-progs",
            "e2fsprogs"
        ],
        "pools": [
            {
                "disks": [
                    "sda"
                ],
                "encryption": false,
                "encryption_cipher": null,
                "encryption_key": null,
                "encryption_key_size": null,
                "encryption_luks_version": null,
                "encryption_password": null,
                "name": "foo",
                "raid_chunk_size": null,
                "raid_device_count": null,
                "raid_level": null,
                "raid_metadata_version": null,
                "raid_spare_count": null,
                "state": "absent",
                "type": "lvm",
                "volumes": [
                    {
                        "_device": "/dev/mapper/foo-test1",
                        "_mount_id": "/dev/mapper/foo-test1",
                        "_raw_device": "/dev/mapper/foo-test1",
                        "cache_devices": [],
                        "cache_mode": null,
                        "cache_size": 0,
                        "cached": false,
                        "compression": null,
                        "deduplication": null,
                        "disks": [
                            "sda"
                        ],
                        "encryption": false,
                        "encryption_cipher": null,
                        "encryption_key": null,
                        "encryption_key_size": null,
                        "encryption_luks_version": null,
                        "encryption_password": null,
                        "fs_create_options": "",
                        "fs_label": "",
                        "fs_overwrite_existing": true,
                        "fs_type": "xfs",
                        "mount_check": 0,
                        "mount_device_identifier": "uuid",
                        "mount_options": "defaults",
                        "mount_passno": 0,
                        "mount_point": "",
                        "name": "test1",
                        "raid_chunk_size": null,
                        "raid_device_count": null,
                        "raid_disks": [],
                        "raid_level": null,
                        "raid_metadata_version": null,
                        "raid_spare_count": null,
                        "size": "3g",
                        "state": "present",
                        "thin": false,
                        "thin_pool_name": null,
                        "thin_pool_size": null,
                        "type": "lvm",
                        "vdo_pool_size": null
                    }
                ]
            }
        ],
        "volumes": []
    }
}

TASK [fedora.linux_system_roles.storage : set the list of pools for test verification] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:101
Wednesday 06 July 2022  14:54:27 +0000 (0:00:00.042)       0:01:07.772 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_pools_list": [
            {
                "disks": [
                    "sda"
                ],
                "encryption": false,
                "encryption_cipher": null,
                "encryption_key": null,
                "encryption_key_size": null,
                "encryption_luks_version": null,
                "encryption_password": null,
                "name": "foo",
                "raid_chunk_size": null,
                "raid_device_count": null,
                "raid_level": null,
                "raid_metadata_version": null,
                "raid_spare_count": null,
                "state": "absent",
                "type": "lvm",
                "volumes": [
                    {
                        "_device": "/dev/mapper/foo-test1",
                        "_mount_id": "/dev/mapper/foo-test1",
                        "_raw_device": "/dev/mapper/foo-test1",
                        "cache_devices": [],
                        "cache_mode": null,
                        "cache_size": 0,
                        "cached": false,
                        "compression": null,
                        "deduplication": null,
                        "disks": [
                            "sda"
                        ],
                        "encryption": false,
                        "encryption_cipher": null,
                        "encryption_key": null,
                        "encryption_key_size": null,
                        "encryption_luks_version": null,
                        "encryption_password": null,
                        "fs_create_options": "",
                        "fs_label": "",
                        "fs_overwrite_existing": true,
                        "fs_type": "xfs",
                        "mount_check": 0,
                        "mount_device_identifier": "uuid",
                        "mount_options": "defaults",
                        "mount_passno": 0,
                        "mount_point": "",
                        "name": "test1",
                        "raid_chunk_size": null,
                        "raid_device_count": null,
                        "raid_disks": [],
                        "raid_level": null,
                        "raid_metadata_version": null,
                        "raid_spare_count": null,
                        "size": "3g",
                        "state": "present",
                        "thin": false,
                        "thin_pool_name": null,
                        "thin_pool_size": null,
                        "type": "lvm",
                        "vdo_pool_size": null
                    }
                ]
            }
        ]
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : set the list of volumes for test verification] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:105
Wednesday 06 July 2022  14:54:28 +0000 (0:00:00.044)       0:01:07.816 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : remove obsolete mounts] **************
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121
Wednesday 06 July 2022  14:54:28 +0000 (0:00:00.039)       0:01:07.856 ******** 

TASK [fedora.linux_system_roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132
Wednesday 06 July 2022  14:54:28 +0000 (0:00:00.037)       0:01:07.894 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : set up new/current mounts] ***********
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:137
Wednesday 06 July 2022  14:54:28 +0000 (0:00:00.024)       0:01:07.918 ******** 

TASK [fedora.linux_system_roles.storage : tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148
Wednesday 06 July 2022  14:54:28 +0000 (0:00:00.036)       0:01:07.954 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : retrieve facts for the /etc/crypttab file] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:156
Wednesday 06 July 2022  14:54:28 +0000 (0:00:00.024)       0:01:07.978 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "stat": {
        "atime": 1657093385.4860332,
        "attr_flags": "",
        "attributes": [],
        "block_size": 4096,
        "blocks": 0,
        "charset": "binary",
        "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709",
        "ctime": 1657005647.423,
        "dev": 31,
        "device_type": 0,
        "executable": false,
        "exists": true,
        "gid": 0,
        "gr_name": "root",
        "inode": 267,
        "isblk": false,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": true,
        "issock": false,
        "isuid": false,
        "mimetype": "inode/x-empty",
        "mode": "0600",
        "mtime": 1657005500.596,
        "nlink": 1,
        "path": "/etc/crypttab",
        "pw_name": "root",
        "readable": true,
        "rgrp": false,
        "roth": false,
        "rusr": true,
        "size": 0,
        "uid": 0,
        "version": "10",
        "wgrp": false,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": false,
        "xoth": false,
        "xusr": false
    }
}

TASK [fedora.linux_system_roles.storage : manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161
Wednesday 06 July 2022  14:54:28 +0000 (0:00:00.436)       0:01:08.414 ******** 

TASK [fedora.linux_system_roles.storage : Update facts] ************************
task path: /tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:183
Wednesday 06 July 2022  14:54:28 +0000 (0:00:00.056)       0:01:08.470 ******** 
ok: [/cache/fedora-35.qcow2.snap]
META: role_complete for /cache/fedora-35.qcow2.snap

TASK [include_tasks] ***********************************************************
task path: /tmp/tmpus9dv81c/tests/storage/tests_remove_mount.yml:82
Wednesday 06 July 2022  14:54:29 +0000 (0:00:00.992)       0:01:09.463 ******** 
included: /tmp/tmpus9dv81c/tests/storage/verify-role-results.yml for /cache/fedora-35.qcow2.snap

TASK [Print out pool information] **********************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-role-results.yml:1
Wednesday 06 July 2022  14:54:29 +0000 (0:00:00.049)       0:01:09.513 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "_storage_pools_list": [
        {
            "disks": [
                "sda"
            ],
            "encryption": false,
            "encryption_cipher": null,
            "encryption_key": null,
            "encryption_key_size": null,
            "encryption_luks_version": null,
            "encryption_password": null,
            "name": "foo",
            "raid_chunk_size": null,
            "raid_device_count": null,
            "raid_level": null,
            "raid_metadata_version": null,
            "raid_spare_count": null,
            "state": "absent",
            "type": "lvm",
            "volumes": [
                {
                    "_device": "/dev/mapper/foo-test1",
                    "_mount_id": "/dev/mapper/foo-test1",
                    "_raw_device": "/dev/mapper/foo-test1",
                    "cache_devices": [],
                    "cache_mode": null,
                    "cache_size": 0,
                    "cached": false,
                    "compression": null,
                    "deduplication": null,
                    "disks": [
                        "sda"
                    ],
                    "encryption": false,
                    "encryption_cipher": null,
                    "encryption_key": null,
                    "encryption_key_size": null,
                    "encryption_luks_version": null,
                    "encryption_password": null,
                    "fs_create_options": "",
                    "fs_label": "",
                    "fs_overwrite_existing": true,
                    "fs_type": "xfs",
                    "mount_check": 0,
                    "mount_device_identifier": "uuid",
                    "mount_options": "defaults",
                    "mount_passno": 0,
                    "mount_point": "",
                    "name": "test1",
                    "raid_chunk_size": null,
                    "raid_device_count": null,
                    "raid_disks": [],
                    "raid_level": null,
                    "raid_metadata_version": null,
                    "raid_spare_count": null,
                    "size": "3g",
                    "state": "present",
                    "thin": false,
                    "thin_pool_name": null,
                    "thin_pool_size": null,
                    "type": "lvm",
                    "vdo_pool_size": null
                }
            ]
        }
    ]
}

TASK [Print out volume information] ********************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-role-results.yml:6
Wednesday 06 July 2022  14:54:29 +0000 (0:00:00.052)       0:01:09.566 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {}

TASK [Collect info about the volumes.] *****************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-role-results.yml:14
Wednesday 06 July 2022  14:54:29 +0000 (0:00:00.038)       0:01:09.604 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "info": {
        "/dev/nvme0n1": {
            "fstype": "",
            "label": "",
            "name": "/dev/nvme0n1",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/nvme1n1": {
            "fstype": "",
            "label": "",
            "name": "/dev/nvme1n1",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/nvme2n1": {
            "fstype": "",
            "label": "",
            "name": "/dev/nvme2n1",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/sda": {
            "fstype": "",
            "label": "",
            "name": "/dev/sda",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/sdb": {
            "fstype": "",
            "label": "",
            "name": "/dev/sdb",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/sdc": {
            "fstype": "",
            "label": "",
            "name": "/dev/sdc",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/sr0": {
            "fstype": "iso9660",
            "label": "cidata",
            "name": "/dev/sr0",
            "size": "364K",
            "type": "rom",
            "uuid": "2022-07-06-14-53-07-00"
        },
        "/dev/vda": {
            "fstype": "",
            "label": "",
            "name": "/dev/vda",
            "size": "4G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vda1": {
            "fstype": "",
            "label": "",
            "name": "/dev/vda1",
            "size": "1M",
            "type": "partition",
            "uuid": ""
        },
        "/dev/vda2": {
            "fstype": "ext4",
            "label": "boot",
            "name": "/dev/vda2",
            "size": "500M",
            "type": "partition",
            "uuid": "5f2f82d0-ae0a-4574-8811-62a31a51a870"
        },
        "/dev/vda3": {
            "fstype": "vfat",
            "label": "",
            "name": "/dev/vda3",
            "size": "100M",
            "type": "partition",
            "uuid": "5B84-6DD7"
        },
        "/dev/vda4": {
            "fstype": "",
            "label": "",
            "name": "/dev/vda4",
            "size": "4M",
            "type": "partition",
            "uuid": ""
        },
        "/dev/vda5": {
            "fstype": "btrfs",
            "label": "fedora",
            "name": "/dev/vda5",
            "size": "3.4G",
            "type": "partition",
            "uuid": "fbdaf05f-1a41-4dc5-b56e-a10edb430f9a"
        },
        "/dev/vdb": {
            "fstype": "",
            "label": "",
            "name": "/dev/vdb",
            "size": "2G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vdb1": {
            "fstype": "ext4",
            "label": "yumcache",
            "name": "/dev/vdb1",
            "size": "2G",
            "type": "partition",
            "uuid": "e676dfc5-3e4b-4331-8ede-73c3f56d2cab"
        },
        "/dev/vdc": {
            "fstype": "",
            "label": "",
            "name": "/dev/vdc",
            "size": "2G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vdc1": {
            "fstype": "ext4",
            "label": "yumvarlib",
            "name": "/dev/vdc1",
            "size": "2G",
            "type": "partition",
            "uuid": "0c299eb4-81f5-4414-b246-b95738eb82f0"
        },
        "/dev/vdd": {
            "fstype": "",
            "label": "",
            "name": "/dev/vdd",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vde": {
            "fstype": "",
            "label": "",
            "name": "/dev/vde",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/vdf": {
            "fstype": "",
            "label": "",
            "name": "/dev/vdf",
            "size": "10G",
            "type": "disk",
            "uuid": ""
        },
        "/dev/zram0": {
            "fstype": "",
            "label": "",
            "name": "/dev/zram0",
            "size": "1.9G",
            "type": "disk",
            "uuid": ""
        }
    }
}

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-role-results.yml:19
Wednesday 06 July 2022  14:54:30 +0000 (0:00:00.421)       0:01:10.026 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "cmd": [
        "cat",
        "/etc/fstab"
    ],
    "delta": "0:00:00.002998",
    "end": "2022-07-06 14:54:29.340800",
    "rc": 0,
    "start": "2022-07-06 14:54:29.337802"
}

STDOUT:


#
# /etc/fstab
# Created by anaconda on Tue Jul  5 07:18:20 2022
#
# Accessible filesystems, by reference, are maintained under '/dev/disk/'.
# See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info.
#
# After editing this file, run 'systemctl daemon-reload' to update systemd
# units generated from this file.
#
UUID=fbdaf05f-1a41-4dc5-b56e-a10edb430f9a /                       btrfs   subvol=root,compress=zstd:1 0 0
UUID=5f2f82d0-ae0a-4574-8811-62a31a51a870 /boot                   ext4    defaults        1 2
UUID=5B84-6DD7          /boot/efi               vfat    defaults,uid=0,gid=0,umask=077,shortname=winnt 0 2
UUID=fbdaf05f-1a41-4dc5-b56e-a10edb430f9a /home                   btrfs   subvol=home,compress=zstd:1 0 0
/dev/vdb1	/var/cache/dnf	auto	defaults,nofail,comment=cloudconfig	0	2
/dev/vdc1	/var/lib/dnf	auto	defaults,nofail,comment=cloudconfig	0	2
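
Because the pool was removed with an empty mount_point, /dev/mapper/foo-test1 must no longer appear in this file, and it does not. A sketch of the kind of check verify-role-results.yml can make against this output (the register name reuses the storage_test_fstab fact cleaned up earlier; the exact expression is an assumption):

    - name: Read the /etc/fstab file for volume existence
      ansible.builtin.command: cat /etc/fstab
      register: storage_test_fstab   # assumed register name
      changed_when: false

    - name: Verify that the removed volume is not mounted via fstab
      ansible.builtin.assert:
        that:
          - storage_test_fstab.stdout is not search('foo-test1')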

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-role-results.yml:24
Wednesday 06 July 2022  14:54:30 +0000 (0:00:00.394)       0:01:10.420 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "cmd": [
        "cat",
        "/etc/crypttab"
    ],
    "delta": "0:00:00.003084",
    "end": "2022-07-06 14:54:29.735725",
    "failed_when_result": false,
    "rc": 0,
    "start": "2022-07-06 14:54:29.732641"
}

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/tmpus9dv81c/tests/storage/verify-role-results.yml:33
Wednesday 06 July 2022  14:54:31 +0000 (0:00:00.393)       0:01:10.814 ******** 
[WARNING]: The loop variable 'storage_test_pool' is already in use. You should
set the `loop_var` value in the `loop_control` option for the task to something
else to avoid variable collisions and unexpected behavior.
included: /tmp/tmpus9dv81c/tests/storage/test-verify-pool.yml for /cache/fedora-35.qcow2.snap => (item={'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'state': 'absent', 'type': 'lvm', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '', 'name': 'test1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1'}], 'raid_chunk_size': None})
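
The warning above is emitted because the include loop reuses a variable name (storage_test_pool) that is also set elsewhere in the test. The remedy Ansible suggests is a loop_control override, roughly as follows (a sketch; the surrounding task is not reproduced verbatim, and the loop_var name is only an example):

    - name: Verify the volumes listed in storage_pools were correctly managed
      include_tasks: test-verify-pool.yml
      loop: "{{ _storage_pools_list }}"
      loop_control:
        loop_var: storage_test_pool_item   # any non-colliding name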

TASK [Set _storage_pool_tests] *************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool.yml:5
Wednesday 06 July 2022  14:54:31 +0000 (0:00:00.062)       0:01:10.876 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_pool_tests": [
            "members",
            "volumes"
        ]
    },
    "changed": false
}

TASK [include_tasks] ***********************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool.yml:18
Wednesday 06 July 2022  14:54:31 +0000 (0:00:00.037)       0:01:10.914 ******** 
included: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-members.yml for /cache/fedora-35.qcow2.snap => (item=members)
included: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-volumes.yml for /cache/fedora-35.qcow2.snap => (item=volumes)

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-members.yml:1
Wednesday 06 July 2022  14:54:31 +0000 (0:00:00.046)       0:01:10.961 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_expected_pv_count": "0",
        "_storage_test_pool_pvs_lvm": []
    },
    "changed": false
}

TASK [Get the canonical device path for each member device] ********************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-members.yml:6
Wednesday 06 July 2022  14:54:31 +0000 (0:00:00.053)       0:01:11.015 ******** 

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-members.yml:15
Wednesday 06 July 2022  14:54:31 +0000 (0:00:00.022)       0:01:11.037 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "__pvs_lvm_len": "0"
    },
    "changed": false
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-members.yml:19
Wednesday 06 July 2022  14:54:31 +0000 (0:00:00.083)       0:01:11.121 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_pool_pvs": []
    },
    "changed": false
}

TASK [Verify PV count] *********************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-members.yml:23
Wednesday 06 July 2022  14:54:31 +0000 (0:00:00.052)       0:01:11.173 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed
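
With the pool absent, both __pvs_lvm_len and _storage_test_expected_pv_count resolve to "0", so the count check passes trivially. A minimal sketch of that comparison, using only the facts shown above:

    - name: Verify PV count
      ansible.builtin.assert:
        that:
          - __pvs_lvm_len | int == _storage_test_expected_pv_count | int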

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-members.yml:29
Wednesday 06 July 2022  14:54:31 +0000 (0:00:00.083)       0:01:11.256 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_expected_pv_type": "disk"
    },
    "changed": false
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-members.yml:33
Wednesday 06 July 2022  14:54:31 +0000 (0:00:00.074)       0:01:11.331 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_expected_pv_type": "disk"
    },
    "changed": false
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-members.yml:37
Wednesday 06 July 2022  14:54:31 +0000 (0:00:00.081)       0:01:11.413 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check the type of each PV] ***********************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-members.yml:41
Wednesday 06 July 2022  14:54:31 +0000 (0:00:00.031)       0:01:11.444 ******** 

TASK [Check MD RAID] ***********************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-members.yml:50
Wednesday 06 July 2022  14:54:31 +0000 (0:00:00.026)       0:01:11.471 ******** 
included: /tmp/tmpus9dv81c/tests/storage/verify-pool-md.yml for /cache/fedora-35.qcow2.snap

TASK [get information about RAID] **********************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-md.yml:6
Wednesday 06 July 2022  14:54:31 +0000 (0:00:00.052)       0:01:11.524 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-md.yml:12
Wednesday 06 July 2022  14:54:31 +0000 (0:00:00.037)       0:01:11.562 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-md.yml:16
Wednesday 06 July 2022  14:54:31 +0000 (0:00:00.034)       0:01:11.596 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-md.yml:20
Wednesday 06 July 2022  14:54:31 +0000 (0:00:00.033)       0:01:11.629 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-md.yml:24
Wednesday 06 July 2022  14:54:31 +0000 (0:00:00.034)       0:01:11.664 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-md.yml:30
Wednesday 06 July 2022  14:54:31 +0000 (0:00:00.025)       0:01:11.689 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-md.yml:36
Wednesday 06 July 2022  14:54:31 +0000 (0:00:00.023)       0:01:11.712 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-md.yml:44
Wednesday 06 July 2022  14:54:31 +0000 (0:00:00.023)       0:01:11.736 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_md_active_devices_re": null,
        "storage_test_md_metadata_version_re": null,
        "storage_test_md_spare_devices_re": null
    },
    "changed": false
}

TASK [Check LVM RAID] **********************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-members.yml:53
Wednesday 06 July 2022  14:54:31 +0000 (0:00:00.031)       0:01:11.768 ******** 
included: /tmp/tmpus9dv81c/tests/storage/verify-pool-members-lvmraid.yml for /cache/fedora-35.qcow2.snap

TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-members-lvmraid.yml:1
Wednesday 06 July 2022  14:54:32 +0000 (0:00:00.044)       0:01:11.812 ******** 
included: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-lvmraid.yml for /cache/fedora-35.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '', 'name': 'test1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1'})

TASK [Get information about LVM RAID] ******************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-lvmraid.yml:3
Wednesday 06 July 2022  14:54:32 +0000 (0:00:00.048)       0:01:11.860 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check that volume is LVM RAID] *******************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-lvmraid.yml:8
Wednesday 06 July 2022  14:54:32 +0000 (0:00:00.032)       0:01:11.893 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-lvmraid.yml:12
Wednesday 06 July 2022  14:54:32 +0000 (0:00:00.033)       0:01:11.927 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check Thin Pools] ********************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-members.yml:56
Wednesday 06 July 2022  14:54:32 +0000 (0:00:00.032)       0:01:11.959 ******** 
included: /tmp/tmpus9dv81c/tests/storage/verify-pool-members-thin.yml for /cache/fedora-35.qcow2.snap

TASK [Validate pool member thinpool settings] **********************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-members-thin.yml:1
Wednesday 06 July 2022  14:54:32 +0000 (0:00:00.047)       0:01:12.007 ******** 
included: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-thin.yml for /cache/fedora-35.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '', 'name': 'test1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1'})

TASK [Get information about thinpool] ******************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-thin.yml:3
Wednesday 06 July 2022  14:54:32 +0000 (0:00:00.046)       0:01:12.053 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check that volume is in correct thinpool (when thinp name is provided)] ***
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-thin.yml:8
Wednesday 06 July 2022  14:54:32 +0000 (0:00:00.023)       0:01:12.077 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check that volume is in thinpool (when thinp name is not provided)] ******
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-thin.yml:13
Wednesday 06 July 2022  14:54:32 +0000 (0:00:00.023)       0:01:12.100 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-thin.yml:17
Wednesday 06 July 2022  14:54:32 +0000 (0:00:00.024)       0:01:12.125 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check member encryption] *************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-members.yml:59
Wednesday 06 July 2022  14:54:32 +0000 (0:00:00.024)       0:01:12.150 ******** 
included: /tmp/tmpus9dv81c/tests/storage/verify-pool-members-encryption.yml for /cache/fedora-35.qcow2.snap

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-members-encryption.yml:4
Wednesday 06 July 2022  14:54:32 +0000 (0:00:00.049)       0:01:12.199 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_expected_crypttab_entries": "0",
        "_storage_test_expected_crypttab_key_file": "-"
    },
    "changed": false
}

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-members-encryption.yml:8
Wednesday 06 July 2022  14:54:32 +0000 (0:00:00.057)       0:01:12.257 ******** 

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-members-encryption.yml:15
Wednesday 06 July 2022  14:54:32 +0000 (0:00:00.023)       0:01:12.281 ******** 

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-members-encryption.yml:22
Wednesday 06 July 2022  14:54:32 +0000 (0:00:00.023)       0:01:12.304 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": null,
        "_storage_test_crypttab_key_file": null
    },
    "changed": false
}

TASK [Check VDO] ***************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-members.yml:62
Wednesday 06 July 2022  14:54:32 +0000 (0:00:00.034)       0:01:12.339 ******** 
included: /tmp/tmpus9dv81c/tests/storage/verify-pool-members-vdo.yml for /cache/fedora-35.qcow2.snap

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-members-vdo.yml:1
Wednesday 06 July 2022  14:54:32 +0000 (0:00:00.049)       0:01:12.388 ******** 
included: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-vdo.yml for /cache/fedora-35.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '', 'name': 'test1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1'})

TASK [get information about VDO deduplication] *********************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-vdo.yml:3
Wednesday 06 July 2022  14:54:32 +0000 (0:00:00.045)       0:01:12.434 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-vdo.yml:8
Wednesday 06 July 2022  14:54:32 +0000 (0:00:00.026)       0:01:12.460 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [assert] ******************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-vdo.yml:11
Wednesday 06 July 2022  14:54:32 +0000 (0:00:00.025)       0:01:12.486 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [assert] ******************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-vdo.yml:16
Wednesday 06 July 2022  14:54:32 +0000 (0:00:00.023)       0:01:12.510 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-vdo.yml:21
Wednesday 06 July 2022  14:54:32 +0000 (0:00:00.026)       0:01:12.537 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [assert] ******************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-vdo.yml:24
Wednesday 06 July 2022  14:54:32 +0000 (0:00:00.064)       0:01:12.601 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [assert] ******************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-vdo.yml:29
Wednesday 06 July 2022  14:54:32 +0000 (0:00:00.026)       0:01:12.628 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-pool-member-vdo.yml:39
Wednesday 06 July 2022  14:54:32 +0000 (0:00:00.026)       0:01:12.655 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_vdo_status": null
    },
    "changed": false
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-members.yml:65
Wednesday 06 July 2022  14:54:32 +0000 (0:00:00.038)       0:01:12.694 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "__pvs_lvm_len": null,
        "_storage_test_expected_pv_count": null,
        "_storage_test_expected_pv_type": null,
        "_storage_test_pool_pvs": [],
        "_storage_test_pool_pvs_lvm": []
    },
    "changed": false
}

TASK [verify the volumes] ******************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-pool-volumes.yml:3
Wednesday 06 July 2022  14:54:32 +0000 (0:00:00.039)       0:01:12.733 ******** 
[WARNING]: The loop variable 'storage_test_volume' is already in use. You
should set the `loop_var` value in the `loop_control` option for the task to
something else to avoid variable collisions and unexpected behavior.
included: /tmp/tmpus9dv81c/tests/storage/test-verify-volume.yml for /cache/fedora-35.qcow2.snap => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '', 'name': 'test1', 'raid_level': None, 'size': '3g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1'})

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume.yml:2
Wednesday 06 July 2022  14:54:32 +0000 (0:00:00.044)       0:01:12.778 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_volume_present": false,
        "_storage_volume_tests": [
            "mount",
            "fstab",
            "fs",
            "device",
            "encryption",
            "md",
            "size",
            "cache"
        ]
    },
    "changed": false
}

TASK [include_tasks] ***********************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume.yml:10
Wednesday 06 July 2022  14:54:33 +0000 (0:00:00.052)       0:01:12.831 ******** 
included: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-mount.yml for /cache/fedora-35.qcow2.snap => (item=mount)
included: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-fstab.yml for /cache/fedora-35.qcow2.snap => (item=fstab)
included: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-fs.yml for /cache/fedora-35.qcow2.snap => (item=fs)
included: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-device.yml for /cache/fedora-35.qcow2.snap => (item=device)
included: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml for /cache/fedora-35.qcow2.snap => (item=encryption)
included: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-md.yml for /cache/fedora-35.qcow2.snap => (item=md)
included: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-size.yml for /cache/fedora-35.qcow2.snap => (item=size)
included: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-cache.yml for /cache/fedora-35.qcow2.snap => (item=cache)

TASK [Get expected mount device based on device type] **************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-mount.yml:6
Wednesday 06 July 2022  14:54:33 +0000 (0:00:00.092)       0:01:12.923 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_device_path": "/dev/mapper/foo-test1"
    },
    "changed": false
}

TASK [Set some facts] **********************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-mount.yml:10
Wednesday 06 July 2022  14:54:33 +0000 (0:00:00.044)       0:01:12.968 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_mount_device_matches": [],
        "storage_test_mount_expected_match_count": "0",
        "storage_test_mount_point_matches": [],
        "storage_test_swap_expected_matches": "0"
    },
    "changed": false
}

TASK [Verify the current mount state by device] ********************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-mount.yml:20
Wednesday 06 July 2022  14:54:33 +0000 (0:00:00.060)       0:01:13.028 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the current mount state by mount point] ***************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-mount.yml:29
Wednesday 06 July 2022  14:54:33 +0000 (0:00:00.028)       0:01:13.057 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify the mount fs type] ************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-mount.yml:37
Wednesday 06 July 2022  14:54:33 +0000 (0:00:00.053)       0:01:13.111 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [command] *****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-mount.yml:46
Wednesday 06 July 2022  14:54:33 +0000 (0:00:00.039)       0:01:13.150 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Gather swap info] ********************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-mount.yml:50
Wednesday 06 July 2022  14:54:33 +0000 (0:00:00.025)       0:01:13.175 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify swap status] ******************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-mount.yml:55
Wednesday 06 July 2022  14:54:33 +0000 (0:00:00.023)       0:01:13.198 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Unset facts] *************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-mount.yml:65
Wednesday 06 July 2022  14:54:33 +0000 (0:00:00.024)       0:01:13.223 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_mount_device_matches": null,
        "storage_test_mount_expected_match_count": null,
        "storage_test_mount_point_matches": null,
        "storage_test_swap_expected_matches": null,
        "storage_test_swaps": null,
        "storage_test_sys_node": null
    },
    "changed": false
}

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-fstab.yml:2
Wednesday 06 July 2022  14:54:33 +0000 (0:00:00.034)       0:01:13.258 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_fstab_expected_id_matches": "0",
        "storage_test_fstab_expected_mount_options_matches": "0",
        "storage_test_fstab_expected_mount_point_matches": "0",
        "storage_test_fstab_id_matches": [],
        "storage_test_fstab_mount_options_matches": [],
        "storage_test_fstab_mount_point_matches": []
    },
    "changed": false
}

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-fstab.yml:12
Wednesday 06 July 2022  14:54:33 +0000 (0:00:00.066)       0:01:13.325 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-fstab.yml:19
Wednesday 06 July 2022  14:54:33 +0000 (0:00:00.025)       0:01:13.351 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-fstab.yml:25
Wednesday 06 July 2022  14:54:33 +0000 (0:00:00.052)       0:01:13.403 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Clean up variables] ******************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-fstab.yml:34
Wednesday 06 July 2022  14:54:33 +0000 (0:00:00.038)       0:01:13.442 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_fstab_expected_id_matches": null,
        "storage_test_fstab_expected_mount_options_matches": null,
        "storage_test_fstab_expected_mount_point_matches": null,
        "storage_test_fstab_id_matches": null,
        "storage_test_fstab_mount_options_matches": null,
        "storage_test_fstab_mount_point_matches": null
    },
    "changed": false
}

TASK [Verify fs type] **********************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-fs.yml:4
Wednesday 06 July 2022  14:54:33 +0000 (0:00:00.038)       0:01:13.480 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify fs label] *********************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-fs.yml:10
Wednesday 06 July 2022  14:54:33 +0000 (0:00:00.026)       0:01:13.507 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [See whether the device node is present] **********************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-device.yml:4
Wednesday 06 July 2022  14:54:33 +0000 (0:00:00.023)       0:01:13.530 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "stat": {
        "exists": false
    }
}

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-device.yml:10
Wednesday 06 July 2022  14:54:34 +0000 (0:00:00.390)       0:01:13.921 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-device.yml:15
Wednesday 06 July 2022  14:54:34 +0000 (0:00:00.040)       0:01:13.962 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [(1/2) Process volume type (set initial value)] ***************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-device.yml:21
Wednesday 06 July 2022  14:54:34 +0000 (0:00:00.028)       0:01:13.991 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "st_volume_type": "lvm"
    },
    "changed": false
}

TASK [(2/2) Process volume type (get RAID value)] ******************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-device.yml:25
Wednesday 06 July 2022  14:54:34 +0000 (0:00:00.037)       0:01:14.028 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the volume's device type] *****************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-device.yml:30
Wednesday 06 July 2022  14:54:34 +0000 (0:00:00.023)       0:01:14.052 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:3
Wednesday 06 July 2022  14:54:34 +0000 (0:00:00.024)       0:01:14.076 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Ensure cryptsetup is present] ********************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:10
Wednesday 06 July 2022  14:54:34 +0000 (0:00:00.025)       0:01:14.102 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:15
Wednesday 06 July 2022  14:54:36 +0000 (0:00:02.142)       0:01:16.244 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:21
Wednesday 06 July 2022  14:54:36 +0000 (0:00:00.026)       0:01:16.271 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:27
Wednesday 06 July 2022  14:54:36 +0000 (0:00:00.026)       0:01:16.297 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:33
Wednesday 06 July 2022  14:54:36 +0000 (0:00:00.024)       0:01:16.322 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:39
Wednesday 06 July 2022  14:54:36 +0000 (0:00:00.026)       0:01:16.348 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS version] ******************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:44
Wednesday 06 July 2022  14:54:36 +0000 (0:00:00.025)       0:01:16.374 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS key size] *****************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:50
Wednesday 06 July 2022  14:54:36 +0000 (0:00:00.025)       0:01:16.400 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:56
Wednesday 06 July 2022  14:54:36 +0000 (0:00:00.025)       0:01:16.425 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:62
Wednesday 06 July 2022  14:54:36 +0000 (0:00:00.024)       0:01:16.450 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": [],
        "_storage_test_expected_crypttab_entries": "0",
        "_storage_test_expected_crypttab_key_file": "-"
    },
    "changed": false
}

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:67
Wednesday 06 July 2022  14:54:36 +0000 (0:00:00.048)       0:01:16.498 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:72
Wednesday 06 July 2022  14:54:36 +0000 (0:00:00.050)       0:01:16.549 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:78
Wednesday 06 July 2022  14:54:36 +0000 (0:00:00.037)       0:01:16.586 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:84
Wednesday 06 July 2022  14:54:36 +0000 (0:00:00.034)       0:01:16.621 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:90
Wednesday 06 July 2022  14:54:36 +0000 (0:00:00.033)       0:01:16.655 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": null,
        "_storage_test_expected_crypttab_entries": null,
        "_storage_test_expected_crypttab_key_file": null
    },
    "changed": false
}

TASK [get information about RAID] **********************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-md.yml:7
Wednesday 06 July 2022  14:54:36 +0000 (0:00:00.033)       0:01:16.688 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-md.yml:13
Wednesday 06 July 2022  14:54:36 +0000 (0:00:00.044)       0:01:16.732 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-md.yml:17
Wednesday 06 July 2022  14:54:36 +0000 (0:00:00.039)       0:01:16.772 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-md.yml:21
Wednesday 06 July 2022  14:54:37 +0000 (0:00:00.038)       0:01:16.811 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID active devices count] *****************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-md.yml:25
Wednesday 06 July 2022  14:54:37 +0000 (0:00:00.035)       0:01:16.847 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID spare devices count] ******************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-md.yml:31
Wednesday 06 July 2022  14:54:37 +0000 (0:00:00.038)       0:01:16.886 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check RAID metadata version] *********************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-md.yml:37
Wednesday 06 July 2022  14:54:37 +0000 (0:00:00.038)       0:01:16.924 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [parse the actual size of the volume] *************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-size.yml:3
Wednesday 06 July 2022  14:54:37 +0000 (0:00:00.039)       0:01:16.963 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [parse the requested size of the volume] **********************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-size.yml:9
Wednesday 06 July 2022  14:54:37 +0000 (0:00:00.026)       0:01:16.990 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Establish base value for expected size] **********************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-size.yml:15
Wednesday 06 July 2022  14:54:37 +0000 (0:00:00.041)       0:01:17.032 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [debug] *******************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-size.yml:20
Wednesday 06 July 2022  14:54:37 +0000 (0:00:00.045)       0:01:17.077 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "storage_test_expected_size": "3221225472"
}

TASK [debug] *******************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-size.yml:25
Wednesday 06 July 2022  14:54:37 +0000 (0:00:00.042)       0:01:17.120 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {}

TASK [debug] *******************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-size.yml:28
Wednesday 06 July 2022  14:54:37 +0000 (0:00:00.042)       0:01:17.162 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-size.yml:31
Wednesday 06 July 2022  14:54:37 +0000 (0:00:00.048)       0:01:17.211 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [debug] *******************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-size.yml:36
Wednesday 06 July 2022  14:54:37 +0000 (0:00:00.045)       0:01:17.256 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-size.yml:39
Wednesday 06 July 2022  14:54:37 +0000 (0:00:00.042)       0:01:17.298 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [debug] *******************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-size.yml:44
Wednesday 06 July 2022  14:54:37 +0000 (0:00:00.039)       0:01:17.338 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "storage_test_actual_size": {
        "changed": false,
        "skip_reason": "Conditional result was False",
        "skipped": true
    }
}

TASK [debug] *******************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-size.yml:47
Wednesday 06 July 2022  14:54:37 +0000 (0:00:00.040)       0:01:17.379 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "storage_test_expected_size": "3221225472"
}

TASK [assert] ******************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-size.yml:50
Wednesday 06 July 2022  14:54:37 +0000 (0:00:00.036)       0:01:17.415 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Get information about the LV] ********************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-cache.yml:6
Wednesday 06 July 2022  14:54:37 +0000 (0:00:00.025)       0:01:17.440 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-cache.yml:14
Wednesday 06 July 2022  14:54:37 +0000 (0:00:00.026)       0:01:17.467 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [check segment type] ******************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-cache.yml:17
Wednesday 06 July 2022  14:54:37 +0000 (0:00:00.025)       0:01:17.492 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-cache.yml:22
Wednesday 06 July 2022  14:54:37 +0000 (0:00:00.025)       0:01:17.518 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [parse the requested cache size] ******************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-cache.yml:26
Wednesday 06 July 2022  14:54:37 +0000 (0:00:00.026)       0:01:17.544 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [set_fact] ****************************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-cache.yml:32
Wednesday 06 July 2022  14:54:37 +0000 (0:00:00.025)       0:01:17.570 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check cache size] ********************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume-cache.yml:36
Wednesday 06 July 2022  14:54:37 +0000 (0:00:00.067)       0:01:17.638 ******** 
skipping: [/cache/fedora-35.qcow2.snap] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Clean up facts] **********************************************************
task path: /tmp/tmpus9dv81c/tests/storage/test-verify-volume.yml:16
Wednesday 06 July 2022  14:54:37 +0000 (0:00:00.026)       0:01:17.665 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "_storage_test_volume_present": null
    },
    "changed": false
}

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-role-results.yml:40
Wednesday 06 July 2022  14:54:37 +0000 (0:00:00.034)       0:01:17.699 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_pool": null
    },
    "changed": false
}

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/tmpus9dv81c/tests/storage/verify-role-results.yml:47
Wednesday 06 July 2022  14:54:37 +0000 (0:00:00.038)       0:01:17.737 ******** 

TASK [Clean up variable namespace] *********************************************
task path: /tmp/tmpus9dv81c/tests/storage/verify-role-results.yml:57
Wednesday 06 July 2022  14:54:37 +0000 (0:00:00.024)       0:01:17.762 ******** 
ok: [/cache/fedora-35.qcow2.snap] => {
    "ansible_facts": {
        "storage_test_blkinfo": null,
        "storage_test_crypttab": null,
        "storage_test_fstab": null,
        "storage_test_volume": null
    },
    "changed": false
}
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
/cache/fedora-35.qcow2.snap : ok=383  changed=4    unreachable=0    failed=0    skipped=322  rescued=0    ignored=0   

Wednesday 06 July 2022  14:54:38 +0000 (0:00:00.047)       0:01:17.810 ******** 
=============================================================================== 
fedora.linux_system_roles.storage : make sure blivet is available ------- 2.66s
/tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:7 
fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state --- 2.58s
/tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 
fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state --- 2.48s
/tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 
Ensure cryptsetup is present -------------------------------------------- 2.14s
/tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:10 -----------
Ensure cryptsetup is present -------------------------------------------- 2.07s
/tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:10 -----------
Ensure cryptsetup is present -------------------------------------------- 2.06s
/tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:10 -----------
fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state --- 2.03s
/tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 
Ensure cryptsetup is present -------------------------------------------- 1.99s
/tmp/tmpus9dv81c/tests/storage/test-verify-volume-encryption.yml:10 -----------
fedora.linux_system_roles.storage : manage the pools and volumes to match the specified state --- 1.95s
/tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 
fedora.linux_system_roles.storage : get service facts ------------------- 1.89s
/tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:47 
fedora.linux_system_roles.storage : make sure required packages are installed --- 1.88s
/tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:41 
Gathering Facts --------------------------------------------------------- 1.28s
/tmp/tmpus9dv81c/tests/storage/tests_remove_mount.yml:2 -----------------------
fedora.linux_system_roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.99s
/tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 
fedora.linux_system_roles.storage : Update facts ------------------------ 0.99s
/tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:183 
fedora.linux_system_roles.storage : Update facts ------------------------ 0.96s
/tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:183 
fedora.linux_system_roles.storage : Update facts ------------------------ 0.93s
/tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:183 
fedora.linux_system_roles.storage : Update facts ------------------------ 0.92s
/tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:183 
fedora.linux_system_roles.storage : Update facts ------------------------ 0.92s
/tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:183 
fedora.linux_system_roles.storage : tell systemd to refresh its view of /etc/fstab --- 0.79s
/tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 
fedora.linux_system_roles.storage : get required packages --------------- 0.73s
/tmp/tmpfdufgi2k/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:23