[WARNING]: Collection infra.leapp does not support Ansible version 2.14.18
[WARNING]: running playbook inside collection infra.leapp
ansible-playbook [core 2.14.18]
  config file = /etc/ansible/ansible.cfg
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python3.9/site-packages/ansible
  ansible collection location = /root/.ansible/collections:/usr/share/ansible/collections
  executable location = /usr/bin/ansible-playbook
  python version = 3.9.25 (main, Mar 9 2026, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-14)] (/usr/bin/python3)
  jinja version = 3.1.2
  libyaml = True
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_remediations_7to8.yml ******************************************
1 plays in /root/.ansible/collections/ansible_collections/infra/leapp/tests/tests_remediations_7to8.yml

PLAY [Test RHEL 7 to 8 remediations] *******************************************

TASK [Gathering Facts] *********************************************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tests_remediations_7to8.yml:2
ok: [managed-node02]

TASK [Include tests_upgrade_custom playbook] ***********************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tests_remediations_7to8.yml:22
included: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/tests_upgrade_custom.yml for managed-node02

TASK [tests_upgrade_custom | Check if leapp upgrade log exists] ****************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/tests_upgrade_custom.yml:12
ok: [managed-node02] => {"changed": false, "stat": {"exists": false}}

TASK [tests_upgrade_custom | Skip test if already upgraded or not RHEL {{ rhel_base_ver }}] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/tests_upgrade_custom.yml:17
META: end_play conditional evaluated to False, continuing play
skipping: [managed-node02] => {"msg": "end_play", "skip_reason": "end_play conditional evaluated to False, continuing play"}

TASK [tests_upgrade_custom | Include common upgrade tasks] *********************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/tests_upgrade_custom.yml:27
included: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/common_upgrade_tasks.yml for managed-node02

TASK [common_upgrade_tasks | Remove leapp packages] ****************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/common_upgrade_tasks.yml:6
ok: [managed-node02] => {"changed": false, "msg": "", "rc": 0, "results": ["leapp-upgrade is not installed"]}

TASK [common_upgrade_tasks | Gather setup tasks] *******************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/common_upgrade_tasks.yml:11
ok: [managed-node02 -> localhost] => {"changed": false, "examined": 4, "files": [{"atime": 1773930825.5808694, "ctime": 1773930825.3988702, "dev": 51716, "gid": 0, "gr_name": "root", "inode": 746586270, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mode": "0644", "mtime": 1773930825.3988702, "nlink": 1, "path": "/root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/setup/remediate_cifs.yml", "pw_name": "root", "rgrp": true, "roth": true, "rusr": true, "size": 272, "uid": 0, "wgrp": false, "woth": false, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}, {"atime": 1773930825.5808694, "ctime": 1773930825.3988702, "dev": 51716, "gid": 0, "gr_name": "root", "inode": 746586271, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mode": "0644", "mtime": 1773930825.3988702, "nlink": 1, "path": "/root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/setup/remediate_remote_using_root.yml", "pw_name": "root", "rgrp": true, "roth": true, "rusr": true, "size": 268, "uid": 0, "wgrp": false, "woth": false, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}, {"atime": 1773930825.5808694, "ctime": 1773930825.3988702, "dev": 51716, "gid": 0, "gr_name": "root", "inode": 746586272, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mode": "0644", "mtime": 1773930825.3988702, "nlink": 1, "path": "/root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/setup/remediate_removed_kernel_drivers.yml", "pw_name": "root", "rgrp": true, "roth": true, "rusr": true, "size": 913, "uid": 0, "wgrp": false, "woth": false, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}, {"atime": 1773930825.5808694, "ctime": 1773930825.3988702, "dev": 51716, "gid": 0, "gr_name": "root", "inode": 746586273, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mode": "0644", "mtime": 1773930825.3988702, "nlink": 1, "path": "/root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/setup/version_lock.yml", "pw_name": "root", "rgrp": true, "roth": true, "rusr": true, "size": 548, "uid": 0, "wgrp": false, "woth": false, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}], "matched": 4, "msg": "All paths examined", "skipped_paths": {}}

TASK [common_upgrade_tasks | Do remediation setup tasks] ***********************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/common_upgrade_tasks.yml:20
included: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/setup/remediate_cifs.yml for managed-node02 => (item=/root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/setup/remediate_cifs.yml)
included: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/setup/remediate_remote_using_root.yml for managed-node02 => (item=/root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/setup/remediate_remote_using_root.yml)
included: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/setup/remediate_removed_kernel_drivers.yml for managed-node02 => (item=/root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/setup/remediate_removed_kernel_drivers.yml)

TASK [setup | remediate_cifs | Add a CIFS share to /etc/fstab] *****************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/setup/remediate_cifs.yml:3
changed: [managed-node02] => {"backup": "", "changed": true, "msg": "line added"}

TASK [setup | remediate_remote_using_root | Set the parameter to not remediate SSH password authentication] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/setup/remediate_remote_using_root.yml:3
ok: [managed-node02] => {"ansible_facts": {"leapp_remediate_ssh_password_auth": false}, "changed": false}

TASK [setup | remediate_removed_kernel_drivers | Set list of test kernel modules] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/setup/remediate_removed_kernel_drivers.yml:4
ok: [managed-node02] => {"ansible_facts": {"leapp_test_kernel_modules": ["3w-9xxx", "pata_acpi", "tulip"]}, "changed": false}

TASK [setup | remediate_removed_kernel_drivers | Load the test kernel modules] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/setup/remediate_removed_kernel_drivers.yml:21

TASK [infra.leapp.common : manage_kernel_modules | Load or unload kernel modules] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/manage_kernel_modules.yml:5
included: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/manage_one_kernel_module.yml for managed-node02 => (item=3w-9xxx)
included: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/manage_one_kernel_module.yml for managed-node02 => (item=pata_acpi)
included: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/manage_one_kernel_module.yml for managed-node02 => (item=tulip)

TASK [infra.leapp.common : manage_one_kernel_module | Load or unload kernel module] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/manage_one_kernel_module.yml:5
changed: [managed-node02] => {"changed": true, "name": "3w-9xxx", "params": "", "state": "present"}

TASK [infra.leapp.common : manage_one_kernel_module | Disable modules-load.d file entry] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/manage_one_kernel_module.yml:17
skipping: [managed-node02] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : manage_one_kernel_module | Ensure modules are not loaded at boot] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/manage_one_kernel_module.yml:26
skipping: [managed-node02] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : manage_one_kernel_module | Rebuild initramfs when module config changed] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/manage_one_kernel_module.yml:35
skipping: [managed-node02] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : manage_one_kernel_module | Debug modprobe.d file] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/manage_one_kernel_module.yml:40
skipping: [managed-node02] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : manage_one_kernel_module | Debug modules-load.d file] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/manage_one_kernel_module.yml:44
skipping: [managed-node02] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : manage_one_kernel_module | Load or unload kernel module] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/manage_one_kernel_module.yml:5
ok: [managed-node02] => {"changed": false, "name": "pata_acpi", "params": "", "state": "present"}

TASK [infra.leapp.common : manage_one_kernel_module | Disable modules-load.d file entry] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/manage_one_kernel_module.yml:17
skipping: [managed-node02] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : manage_one_kernel_module | Ensure modules are not loaded at boot] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/manage_one_kernel_module.yml:26
skipping: [managed-node02] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : manage_one_kernel_module | Rebuild initramfs when module config changed] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/manage_one_kernel_module.yml:35
skipping: [managed-node02] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : manage_one_kernel_module | Debug modprobe.d file] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/manage_one_kernel_module.yml:40
skipping: [managed-node02] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : manage_one_kernel_module | Debug modules-load.d file] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/manage_one_kernel_module.yml:44
skipping: [managed-node02] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : manage_one_kernel_module | Load or unload kernel module] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/manage_one_kernel_module.yml:5
changed: [managed-node02] => {"changed": true, "name": "tulip", "params": "", "state": "present"}

TASK [infra.leapp.common : manage_one_kernel_module | Disable modules-load.d file entry] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/manage_one_kernel_module.yml:17
skipping: [managed-node02] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : manage_one_kernel_module | Ensure modules are not loaded at boot] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/manage_one_kernel_module.yml:26
skipping: [managed-node02] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : manage_one_kernel_module | Rebuild initramfs when module config changed] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/manage_one_kernel_module.yml:35
skipping: [managed-node02] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : manage_one_kernel_module | Debug modprobe.d file] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/manage_one_kernel_module.yml:40
skipping: [managed-node02] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : manage_one_kernel_module | Debug modules-load.d file] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/manage_one_kernel_module.yml:44
skipping: [managed-node02] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [common_upgrade_tasks | Do setup tasks] ***********************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/common_upgrade_tasks.yml:31
skipping: [managed-node02] => (item=/root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/setup/version_lock.yml) => {"ansible_loop_var": "setup_task_file", "changed": false, "setup_task_file": "/root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/setup/version_lock.yml", "skip_reason": "Conditional result was False"}
skipping: [managed-node02] => {"changed": false, "msg": "All items skipped"}

TASK [common_upgrade_tasks | Run first analysis] *******************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/common_upgrade_tasks.yml:42

TASK [infra.leapp.analysis : Lock timestamped variables] ***********************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/main.yml:5
ok: [managed-node02] => {"ansible_facts": {"__leapp_timestamp": "2026-03-19_14-41-27"}, "changed": false}

TASK [Initialize lock, logging, and common vars] *******************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/main.yml:9

TASK [infra.leapp.common : init_leapp_log | Ensure that log directory exists] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/init_leapp_log.yml:9
changed: [managed-node02] => {"changed": true, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/var/log/leapp", "secontext": "unconfined_u:object_r:var_log_t:s0", "size": 6, "state": "directory", "uid": 0}

TASK [infra.leapp.common : init_leapp_log | Check for existing log file] *******
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/init_leapp_log.yml:17
ok: [managed-node02] => {"changed": false, "stat": {"exists": false}}

TASK [infra.leapp.common : init_leapp_log | Fail if log file already exists] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/init_leapp_log.yml:22
skipping: [managed-node02] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : init_leapp_log | Create new log file] ***************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/init_leapp_log.yml:31
changed: [managed-node02] => {"changed": true, "checksum": "683986eef3310963cbf5ef0ce57b4ec71df559ef", "dest": "/var/log/leapp/ansible_leapp_analysis.log", "gid": 0, "group": "root", "md5sum": "59ab43c8022cdc076421e81820112c3c", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:var_log_t:s0", "size": 70, "src": "/root/.ansible/tmp/ansible-tmp-1773931288.9150202-9578-17725389020511/source", "state": "file", "uid": 0}

TASK [infra.leapp.common : init_leapp_log | /etc/ansible/facts.d directory exists] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/init_leapp_log.yml:41
changed: [managed-node02] => {"changed": true, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/etc/ansible/facts.d", "secontext": "unconfined_u:object_r:etc_t:s0", "size": 6, "state": "directory", "uid": 0}

TASK [infra.leapp.common : init_leapp_log | Capture current ansible_facts for validation after upgrade] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/init_leapp_log.yml:49
changed: [managed-node02] => (item=/etc/ansible/facts.d/pre_ipu.fact) => {"ansible_loop_var": "item", "changed": true, "checksum": "33c5f505b8d14c28acf5463484de3c01f98f216c", "dest": "/etc/ansible/facts.d/pre_ipu.fact", "gid": 0, "group": "root", "item": "/etc/ansible/facts.d/pre_ipu.fact", "md5sum": "4361657d1be2d9f05774955d252dfce8", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:etc_t:s0", "size": 11999, "src": "/root/.ansible/tmp/ansible-tmp-1773931290.1667938-9726-5222066729646/source", "state": "file", "uid": 0}
changed: [managed-node02] => (item=/var/log/leapp/ansible_leapp_analysis.log) => {"ansible_loop_var": "item", "changed": true, "checksum": "33c5f505b8d14c28acf5463484de3c01f98f216c", "dest": "/var/log/leapp/ansible_leapp_analysis.log", "gid": 0, "group": "root", "item": "/var/log/leapp/ansible_leapp_analysis.log", "md5sum": "4361657d1be2d9f05774955d252dfce8", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:var_log_t:s0", "size": 11999, "src": "/root/.ansible/tmp/ansible-tmp-1773931290.7691894-9726-93112633773877/source", "state": "file", "uid": 0}

TASK [infra.leapp.common : init_leapp_log | Capture a list of non-rhel versioned packages] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/init_leapp_log.yml:60
ok: [managed-node02] => {"changed": false, "cmd": "set -o pipefail; export PATH=$PATH; rpm -qa | grep -ve '[\\.|+]el7' | grep -vE '^(gpg-pubkey|libmodulemd|katello-ca-consumer)' | sort", "delta": "0:00:00.376663", "end": "2026-03-19 10:41:32.316469", "failed_when_result": false, "msg": "", "rc": 0, "start": "2026-03-19 10:41:31.939806", "stderr": "", "stderr_lines": [], "stdout": "epel-release-7-14.noarch\ntps-devel-2.44.50-1.noarch", "stdout_lines": ["epel-release-7-14.noarch", "tps-devel-2.44.50-1.noarch"]}

TASK [infra.leapp.common : init_leapp_log | Create fact with the non-rhel versioned packages list] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/init_leapp_log.yml:74
ok: [managed-node02] => {"ansible_facts": {"non_rhel_packages": ["epel-release-7-14.noarch", "tps-devel-2.44.50-1.noarch"]}, "changed": false}

TASK [infra.leapp.common : init_leapp_log | Capture the list of non-rhel versioned packages in a separate fact file] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/init_leapp_log.yml:78
changed: [managed-node02] => (item=/etc/ansible/facts.d/non_rhel_packages.fact) => {"ansible_loop_var": "item", "changed": true, "checksum": "6d36b22d9c2b2f366fc090edfbac427c77d524a5", "dest": "/etc/ansible/facts.d/non_rhel_packages.fact", "gid": 0, "group": "root", "item": "/etc/ansible/facts.d/non_rhel_packages.fact", "md5sum": "a7d4e8abcc28ebc36ca5401fee060144", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:etc_t:s0", "size": 58, "src": "/root/.ansible/tmp/ansible-tmp-1773931292.509766-9963-22931329066399/source", "state": "file", "uid": 0}
changed: [managed-node02] => (item=/var/log/leapp/ansible_leapp_analysis.log) => {"ansible_loop_var": "item", "changed": true, "checksum": "6d36b22d9c2b2f366fc090edfbac427c77d524a5", "dest": "/var/log/leapp/ansible_leapp_analysis.log", "gid": 0, "group": "root", "item": "/var/log/leapp/ansible_leapp_analysis.log", "md5sum": "a7d4e8abcc28ebc36ca5401fee060144", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:var_log_t:s0", "size": 58, "src": "/root/.ansible/tmp/ansible-tmp-1773931293.0842752-9963-153484400679562/source", "state": "file", "uid": 0}

TASK [infra.leapp.analysis : Include tasks for preupg assistant analysis] ******
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/main.yml:19
skipping: [managed-node02] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.analysis : Include tasks for leapp preupgrade analysis] ******
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/main.yml:23
included: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml for managed-node02

TASK [analysis-leapp | Include pre_upgrade_update.yml] *************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml:4

TASK [infra.leapp.common : pre_upgrade_update | Register with Satellite activation key] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/pre_upgrade_update.yml:3
skipping: [managed-node02] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [pre_upgrade_update | Include custom_local_repos for local_repos_pre_leapp] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/pre_upgrade_update.yml:10

TASK [infra.leapp.common : custom_local_repos | Remove old /etc/leapp/files/leapp_upgrade_repositories.repo] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/custom_local_repos.yml:2
skipping: [managed-node02] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : custom_local_repos | Validate repo definitions have baseurl or metalink] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/custom_local_repos.yml:9
skipping: [managed-node02] => {"changed": false, "skipped_reason": "No items in the list"}

TASK [infra.leapp.common : custom_local_repos | Enable custom upgrade yum repositories] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/custom_local_repos.yml:16
skipping: [managed-node02] => {"changed": false, "skipped_reason": "No items in the list"}

TASK [infra.leapp.common : pre_upgrade_update | Get package version lock entries] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/pre_upgrade_update.yml:21
skipping: [managed-node02] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : pre_upgrade_update | Remove all package version locks] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/pre_upgrade_update.yml:28
skipping: [managed-node02] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : pre_upgrade_update | Install packages for upgrade from RHEL 7] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/pre_upgrade_update.yml:36
skipping: [managed-node02] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : pre_upgrade_update | Include update-and-reboot.yml] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/pre_upgrade_update.yml:45
skipping: [managed-node02] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.analysis : analysis-leapp | Ensure leapp log directory exists] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml:11
changed: [managed-node02] => {"changed": true, "gid": 0, "group": "root", "mode": "0700", "owner": "root", "path": "/var/log/leapp", "secontext": "unconfined_u:object_r:var_log_t:s0", "size": 40, "state": "directory", "uid": 0}

TASK [infra.leapp.analysis : analysis-leapp | Populate leapp_answers file] *****
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml:19
changed: [managed-node02] => {"changed": true, "checksum": "3d934ad808576e3a7fb4c14a89645a4ad55ccf53", "dest": "/var/log/leapp/answerfile", "gid": 0, "group": "root", "md5sum": "01e375235c8e4cafdec593b260354063", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:var_log_t:s0", "size": 48, "src": "/root/.ansible/tmp/ansible-tmp-1773931294.4437504-10117-246753698138930/source", "state": "file", "uid": 0}

TASK [analysis-leapp | Create /etc/leapp/files/leapp_upgrade_repositories.repo] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml:28

TASK [infra.leapp.common : custom_local_repos | Remove old /etc/leapp/files/leapp_upgrade_repositories.repo] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/custom_local_repos.yml:2
ok: [managed-node02] => {"changed": false, "path": "/etc/leapp/files/leapp_upgrade_repositories.repo", "state": "absent"}

TASK [infra.leapp.common : custom_local_repos | Validate repo definitions have baseurl or metalink] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/custom_local_repos.yml:9
ok: [managed-node02] => (item={'name': 'rhel-8-for-x86_64-baseos-rpms', 'description': 'BaseOS for x86_64', 'baseurl': 'http://download.devel.redhat.com/rhel-8/nightly/updates/RHEL-8/latest-RHEL-8.10/compose/BaseOS/x86_64/os/', 'state': 'present'}) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": {
        "baseurl": "http://download.devel.redhat.com/rhel-8/nightly/updates/RHEL-8/latest-RHEL-8.10/compose/BaseOS/x86_64/os/",
        "description": "BaseOS for x86_64",
        "name": "rhel-8-for-x86_64-baseos-rpms",
        "state": "present"
    },
    "msg": "All assertions passed"
}
ok: [managed-node02] => (item={'name': 'rhel-8-for-x86_64-appstream-rpms', 'description': 'AppStream for x86_64', 'baseurl': 'http://download.devel.redhat.com/rhel-8/nightly/updates/RHEL-8/latest-RHEL-8.10/compose/AppStream/x86_64/os/', 'state': 'present'}) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": {
        "baseurl": "http://download.devel.redhat.com/rhel-8/nightly/updates/RHEL-8/latest-RHEL-8.10/compose/AppStream/x86_64/os/",
        "description": "AppStream for x86_64",
        "name": "rhel-8-for-x86_64-appstream-rpms",
        "state": "present"
    },
    "msg": "All assertions passed"
}

TASK [infra.leapp.common : custom_local_repos | Enable custom upgrade yum repositories] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/custom_local_repos.yml:16
failed: [managed-node02] (item={'name': 'rhel-8-for-x86_64-baseos-rpms', 'description': 'BaseOS for x86_64', 'baseurl': 'http://download.devel.redhat.com/rhel-8/nightly/updates/RHEL-8/latest-RHEL-8.10/compose/BaseOS/x86_64/os/', 'state': 'present'}) => {"ansible_loop_var": "item", "changed": false, "details": "[Errno 2] No such file or directory: '/etc/leapp/files/leapp_upgrade_repositories.repo'", "item": {"baseurl": "http://download.devel.redhat.com/rhel-8/nightly/updates/RHEL-8/latest-RHEL-8.10/compose/BaseOS/x86_64/os/", "description": "BaseOS for x86_64", "name": "rhel-8-for-x86_64-baseos-rpms", "state": "present"}, "msg": "Problems handling file /etc/leapp/files/leapp_upgrade_repositories.repo."}
failed: [managed-node02] (item={'name': 'rhel-8-for-x86_64-appstream-rpms', 'description': 'AppStream for x86_64', 'baseurl': 'http://download.devel.redhat.com/rhel-8/nightly/updates/RHEL-8/latest-RHEL-8.10/compose/AppStream/x86_64/os/', 'state': 'present'}) => {"ansible_loop_var": "item", "changed": false, "details": "[Errno 2] No such file or directory: '/etc/leapp/files/leapp_upgrade_repositories.repo'", "item": {"baseurl": "http://download.devel.redhat.com/rhel-8/nightly/updates/RHEL-8/latest-RHEL-8.10/compose/AppStream/x86_64/os/", "description": "AppStream for x86_64", "name": "rhel-8-for-x86_64-appstream-rpms", "state": "present"}, "msg": "Problems handling file /etc/leapp/files/leapp_upgrade_repositories.repo."}

TASK [analysis-leapp | Include custom_local_repos for local_repos_post_analysis] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml:70

TASK [infra.leapp.common : custom_local_repos | Remove old /etc/leapp/files/leapp_upgrade_repositories.repo] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/custom_local_repos.yml:2
skipping: [managed-node02] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : custom_local_repos | Validate repo definitions have baseurl or metalink] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/custom_local_repos.yml:9
skipping: [managed-node02] => {"changed": false, "skipped_reason": "No items in the list"}

TASK [infra.leapp.common : custom_local_repos | Enable custom upgrade yum repositories] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/custom_local_repos.yml:16
skipping: [managed-node02] => {"changed": false, "skipped_reason": "No items in the list"}

TASK [analysis-leapp | Restore original Satellite activation key] **************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml:80
skipping: [managed-node02] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [analysis-leapp | Copy reports to the controller] *************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml:91

TASK [infra.leapp.common : copy_reports_to_controller | Ensure reports directory on controller] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_reports_to_controller.yml:20
changed: [managed-node02 -> localhost] => {"changed": true, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/root/.ansible/collections/ansible_collections/infra/leapp/tests/ansible_leapp_analysis_logs_2026-03-19_14-41-27", "secontext": "unconfined_u:object_r:admin_home_t:s0", "size": 6, "state": "directory", "uid": 0}

TASK [infra.leapp.common : copy_reports_to_controller | Fetch report files if they exist] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_reports_to_controller.yml:30
included: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/fetch_file_if_exists.yml for managed-node02 => (item=/var/log/leapp/leapp-report.txt)
included: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/fetch_file_if_exists.yml for managed-node02 => (item=/var/log/leapp/leapp-report.json)
included: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/fetch_file_if_exists.yml for managed-node02 => (item=/var/log/leapp/leapp-preupgrade.log)

TASK [infra.leapp.common : fetch_file_if_exists | Check if file exists] ********
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/fetch_file_if_exists.yml:7
ok: [managed-node02] => {"changed": false, "stat": {"exists": false}}

TASK [infra.leapp.common : fetch_file_if_exists | Copy report file to the controller] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/fetch_file_if_exists.yml:12
skipping: [managed-node02] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : fetch_file_if_exists | Check if file exists] ********
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/fetch_file_if_exists.yml:7
ok: [managed-node02] => {"changed": false, "stat": {"exists": false}}

TASK [infra.leapp.common : fetch_file_if_exists | Copy report file to the controller] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/fetch_file_if_exists.yml:12
skipping: [managed-node02] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : fetch_file_if_exists | Check if file exists] ********
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/fetch_file_if_exists.yml:7
ok: [managed-node02] => {"changed": false, "stat": {"exists": false}}

TASK [infra.leapp.common : fetch_file_if_exists | Copy report file to the controller] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/fetch_file_if_exists.yml:12
skipping: [managed-node02] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [copy_reports_to_controller | Copy log file to the controller] ************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_reports_to_controller.yml:39

TASK [infra.leapp.common : copy_archive_leapp_log | Check for log file] ********
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_archive_leapp_log.yml:11
ok: [managed-node02] => {"changed": false, "stat": {"atime": 1773931293.565951, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "6d36b22d9c2b2f366fc090edfbac427c77d524a5", "ctime": 1773931293.5669513, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 188743765, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1773931293.37495, "nlink": 1, "path": "/var/log/leapp/ansible_leapp_analysis.log", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 58, "uid": 0, "version": "1943260242", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}}

TASK [infra.leapp.common : copy_archive_leapp_log | Add end time to log file] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_archive_leapp_log.yml:19
changed: [managed-node02] => {"backup": "", "changed": true, "msg": "line added"}

TASK [infra.leapp.common : copy_archive_leapp_log | Slurp file /var/log/leapp/ansible_leapp_analysis.log] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_archive_leapp_log.yml:27
ok: [managed-node02] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}

TASK [infra.leapp.common : copy_archive_leapp_log | Decode file /var/log/leapp/ansible_leapp_analysis.log] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_archive_leapp_log.yml:33
ok: [managed-node02] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}

TASK [infra.leapp.common : copy_archive_leapp_log | Ensure reports directory on controller] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_archive_leapp_log.yml:38
ok: [managed-node02 -> localhost] => {"changed": false, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/root/.ansible/collections/ansible_collections/infra/leapp/tests/ansible_leapp_analysis_logs_2026-03-19_14-41-27", "secontext": "unconfined_u:object_r:admin_home_t:s0", "size": 6, "state": "directory", "uid": 0}

TASK [infra.leapp.common : copy_archive_leapp_log | Copy ansible leapp log to the controller] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_archive_leapp_log.yml:48
changed: [managed-node02] => {"changed": true, "checksum": "9edfd6bd47cc593f7a95649e2645a87bbad0da0f", "dest": "/root/.ansible/collections/ansible_collections/infra/leapp/tests/ansible_leapp_analysis_logs_2026-03-19_14-41-27/managed-node02/ansible_leapp_analysis.log", "md5sum": "2379b44da4cbeb5abd60765d5a57b31e", "remote_checksum": "9edfd6bd47cc593f7a95649e2645a87bbad0da0f", "remote_md5sum": null}

TASK [infra.leapp.common : copy_archive_leapp_log | Copy log file to timestamped location] ***
task path:
/root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_archive_leapp_log.yml:54 changed: [managed-node02] => {"changed": true, "checksum": "9edfd6bd47cc593f7a95649e2645a87bbad0da0f", "dest": "/var/log/leapp/ansible_leapp_analysis_2026-03-19_14-41-27.log", "gid": 0, "group": "root", "md5sum": "2379b44da4cbeb5abd60765d5a57b31e", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:var_log_t:s0", "size": 93, "src": "/var/log/leapp/ansible_leapp_analysis.log", "state": "file", "uid": 0} TASK [infra.leapp.common : copy_archive_leapp_log | Remove original log file] *** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_archive_leapp_log.yml:66 changed: [managed-node02] => {"changed": true, "path": "/var/log/leapp/ansible_leapp_analysis.log", "state": "absent"} TASK [tests_upgrade_custom | Include cleanup logs] ***************************** task path: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/tests_upgrade_custom.yml:49 TASK [infra.leapp.common : cleanup_logs | Cleanup | Remove log files] ********** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/cleanup_logs.yml:2 changed: [managed-node02] => {"changed": true, "cmd": "set -euxo pipefail\nrm -f /var/log/leapp/*.log\nrm -f /var/log/leapp/*.json\nrm -f /var/log/leapp/*.txt\n", "delta": "0:00:00.005674", "end": "2026-03-19 10:41:40.648371", "msg": "", "rc": 0, "start": "2026-03-19 10:41:40.642697", "stderr": "+ rm -f /var/log/leapp/ansible_leapp_analysis_2026-03-19_14-41-27.log\n+ rm -f '/var/log/leapp/*.json'\n+ rm -f '/var/log/leapp/*.txt'", "stderr_lines": ["+ rm -f /var/log/leapp/ansible_leapp_analysis_2026-03-19_14-41-27.log", "+ rm -f '/var/log/leapp/*.json'", "+ rm -f '/var/log/leapp/*.txt'"], "stdout": "", "stdout_lines": []} PLAY RECAP ********************************************************************* managed-node02 : ok=48 changed=16 unreachable=0 failed=1 
skipped=33 rescued=0 ignored=0 -- Logs begin at Thu 2026-03-19 10:31:08 EDT, end at Thu 2026-03-19 10:41:41 EDT. -- Mar 19 10:41:21 managed-node02 sshd[4382]: Accepted publickey for root from 10.31.42.92 port 56830 ssh2: ECDSA SHA256:rjYXslfDH16LW7x9xHtQw01WN25hBNk8QHaKBGTw0gY Mar 19 10:41:21 managed-node02 systemd[1]: Started Session 8 of user root. -- Subject: Unit session-8.scope has finished start-up -- Defined-By: systemd -- Support: http://lists.freedesktop.org/mailman/listinfo/systemd-devel -- -- Unit session-8.scope has finished starting up. -- -- The start-up result is done. Mar 19 10:41:21 managed-node02 systemd-logind[547]: New session 8 of user root. -- Subject: A new session 8 has been created for user root -- Defined-By: systemd -- Support: http://lists.freedesktop.org/mailman/listinfo/systemd-devel -- Documentation: http://www.freedesktop.org/wiki/Software/systemd/multiseat -- -- A new session with the ID 8 has been created for the user root. -- -- The leading process of the session is 4382. 
Mar 19 10:41:21 managed-node02 sshd[4382]: pam_unix(sshd:session): session opened for user root by (uid=0)
Mar 19 10:41:22 managed-node02 ansible-ansible.legacy.setup[4453]: Invoked with filter=[] gather_subset=['all'] fact_path=/etc/ansible/facts.d gather_timeout=10
Mar 19 10:41:22 managed-node02 ansible-ansible.builtin.stat[4543]: Invoked with checksum_algorithm=sha1 get_checksum=True follow=False path=/var/log/leapp/leapp-upgrade.log get_md5=False get_mime=True get_attributes=True
Mar 19 10:41:23 managed-node02 ansible-ansible.legacy.yum[4604]: Invoked with lock_timeout=30 update_cache=False conf_file=None exclude=[] allow_downgrade=False sslverify=True disable_gpg_check=False disable_excludes=None use_backend=auto validate_certs=True state=absent disablerepo=[] skip_broken=False releasever=None cacheonly=False autoremove=False download_dir=None installroot=/ install_weak_deps=True name=['leapp-upgrade'] download_only=False bugfix=False list=None install_repoquery=True update_only=False disable_plugin=[] enablerepo=[] security=False enable_plugin=[]
Mar 19 10:41:25 managed-node02 ansible-ansible.builtin.lineinfile[4669]: Invoked with group=None insertbefore=None unsafe_writes=False selevel=None create=False seuser=None serole=None owner=None backrefs=False search_string=None state=present firstmatch=False mode=None path=/etc/fstab insertafter=None regexp=None line=//127.0.0.1/test_remediate_cifs /mnt/cifs cifs username=test,password=test 0 0 attributes=None backup=False validate=None setype=None
Mar 19 10:41:25 managed-node02 ansible-infra.leapp.modprobe[4730]: Invoked with state=present params= name=3w-9xxx persistent=disabled
Mar 19 10:41:25 managed-node02 kernel: 3ware 9000 Storage Controller device driver for Linux v2.26.02.014.rh1.
Mar 19 10:41:26 managed-node02 ansible-infra.leapp.modprobe[4796]: Invoked with state=present params= name=pata_acpi persistent=disabled
Mar 19 10:41:26 managed-node02 ansible-infra.leapp.modprobe[4859]: Invoked with state=present params= name=tulip persistent=disabled
Mar 19 10:41:27 managed-node02 kernel: tulip: Linux Tulip driver version 1.1.15 (Feb 27, 2007)
Mar 19 10:41:28 managed-node02 ansible-ansible.builtin.file[4923]: Invoked with src=None selevel=None force=False setype=None _original_basename=None unsafe_writes=False access_time=None seuser=None recurse=False state=directory access_time_format=%Y%m%d%H%M.%S group=root modification_time=None serole=None _diff_peek=None modification_time_format=%Y%m%d%H%M.%S path=/var/log/leapp owner=root follow=True attributes=None mode=0755
Mar 19 10:41:28 managed-node02 ansible-ansible.builtin.stat[4984]: Invoked with checksum_algorithm=sha1 get_checksum=True follow=False path=/var/log/leapp/ansible_leapp_analysis.log get_md5=False get_mime=True get_attributes=True
Mar 19 10:41:29 managed-node02 ansible-ansible.legacy.stat[5045]: Invoked with checksum_algorithm=sha1 get_checksum=True path=/var/log/leapp/ansible_leapp_analysis.log follow=False get_md5=False get_mime=True get_attributes=True
Mar 19 10:41:29 managed-node02 ansible-ansible.legacy.copy[5091]: Invoked with src=/root/.ansible/tmp/ansible-tmp-1773931288.9150202-9578-17725389020511/source directory_mode=None force=True attributes=None remote_src=None unsafe_writes=False dest=/var/log/leapp/ansible_leapp_analysis.log seuser=None setype=None group=root content=NOT_LOGGING_PARAMETER _original_basename=tmpt7xbjf5_ serole=None mode=0644 selevel=None owner=root follow=False validate=None checksum=683986eef3310963cbf5ef0ce57b4ec71df559ef backup=False local_follow=None
Mar 19 10:41:30 managed-node02 ansible-ansible.builtin.file[5152]: Invoked with src=None selevel=None force=False setype=None _original_basename=None unsafe_writes=False access_time=None seuser=None recurse=False state=directory access_time_format=%Y%m%d%H%M.%S group=root modification_time=None serole=None _diff_peek=None modification_time_format=%Y%m%d%H%M.%S path=/etc/ansible/facts.d owner=root follow=True attributes=None mode=0755
Mar 19 10:41:30 managed-node02 ansible-ansible.legacy.stat[5213]: Invoked with checksum_algorithm=sha1 get_checksum=True path=/etc/ansible/facts.d/pre_ipu.fact follow=False get_md5=False get_mime=True get_attributes=True
Mar 19 10:41:30 managed-node02 ansible-ansible.legacy.copy[5259]: Invoked with src=/root/.ansible/tmp/ansible-tmp-1773931290.1667938-9726-5222066729646/source directory_mode=None force=True attributes=None remote_src=None unsafe_writes=False dest=/etc/ansible/facts.d/pre_ipu.fact seuser=None setype=None group=root content=NOT_LOGGING_PARAMETER _original_basename=tmpnnel5qk_ serole=None mode=0644 selevel=None owner=root follow=False validate=None checksum=33c5f505b8d14c28acf5463484de3c01f98f216c backup=False local_follow=None
Mar 19 10:41:30 managed-node02 ansible-ansible.legacy.stat[5320]: Invoked with checksum_algorithm=sha1 get_checksum=True path=/var/log/leapp/ansible_leapp_analysis.log follow=False get_md5=False get_mime=True get_attributes=True
Mar 19 10:41:31 managed-node02 ansible-ansible.legacy.copy[5368]: Invoked with src=/root/.ansible/tmp/ansible-tmp-1773931290.7691894-9726-93112633773877/source directory_mode=None force=True attributes=None remote_src=None unsafe_writes=False dest=/var/log/leapp/ansible_leapp_analysis.log seuser=None setype=None group=root content=NOT_LOGGING_PARAMETER _original_basename=tmp667xy685 serole=None mode=0644 selevel=None owner=root follow=False validate=None checksum=33c5f505b8d14c28acf5463484de3c01f98f216c backup=False local_follow=None
Mar 19 10:41:31 managed-node02 ansible-ansible.legacy.command[5429]: Invoked with executable=None _uses_shell=True strip_empty_ends=True _raw_params=set -o pipefail; export PATH=$PATH; rpm -qa | grep -ve '[\.|+]el7' | grep -vE '^(gpg-pubkey|libmodulemd|katello-ca-consumer)' | sort removes=None argv=None creates=None chdir=None stdin_add_newline=True stdin=None
Mar 19 10:41:32 managed-node02 ansible-ansible.legacy.stat[5495]: Invoked with checksum_algorithm=sha1 get_checksum=True path=/etc/ansible/facts.d/non_rhel_packages.fact follow=False get_md5=False get_mime=True get_attributes=True
Mar 19 10:41:32 managed-node02 ansible-ansible.legacy.copy[5541]: Invoked with src=/root/.ansible/tmp/ansible-tmp-1773931292.509766-9963-22931329066399/source directory_mode=None force=True attributes=None remote_src=None unsafe_writes=False dest=/etc/ansible/facts.d/non_rhel_packages.fact seuser=None setype=None group=root content=NOT_LOGGING_PARAMETER _original_basename=tmpih8gjn09 serole=None mode=0644 selevel=None owner=root follow=False validate=None checksum=6d36b22d9c2b2f366fc090edfbac427c77d524a5 backup=False local_follow=None
Mar 19 10:41:33 managed-node02 ansible-ansible.legacy.stat[5602]: Invoked with checksum_algorithm=sha1 get_checksum=True path=/var/log/leapp/ansible_leapp_analysis.log follow=False get_md5=False get_mime=True get_attributes=True
Mar 19 10:41:33 managed-node02 ansible-ansible.legacy.copy[5650]: Invoked with src=/root/.ansible/tmp/ansible-tmp-1773931293.0842752-9963-153484400679562/source directory_mode=None force=True attributes=None remote_src=None unsafe_writes=False dest=/var/log/leapp/ansible_leapp_analysis.log seuser=None setype=None group=root content=NOT_LOGGING_PARAMETER _original_basename=tmp4nrgm7dg serole=None mode=0644 selevel=None owner=root follow=False validate=None checksum=6d36b22d9c2b2f366fc090edfbac427c77d524a5 backup=False local_follow=None
Mar 19 10:41:34 managed-node02 ansible-ansible.builtin.file[5711]: Invoked with src=None selevel=None force=False setype=None _original_basename=None unsafe_writes=False access_time=None seuser=None recurse=False state=directory access_time_format=%Y%m%d%H%M.%S group=root modification_time=None serole=None _diff_peek=None modification_time_format=%Y%m%d%H%M.%S path=/var/log/leapp owner=root follow=True attributes=None mode=0700
Mar 19 10:41:34 managed-node02 ansible-ansible.legacy.stat[5772]: Invoked with checksum_algorithm=sha1 get_checksum=True path=/var/log/leapp/answerfile follow=False get_md5=False get_mime=True get_attributes=True
Mar 19 10:41:34 managed-node02 ansible-ansible.legacy.copy[5818]: Invoked with src=/root/.ansible/tmp/ansible-tmp-1773931294.4437504-10117-246753698138930/source directory_mode=None force=True attributes=None remote_src=None unsafe_writes=False dest=/var/log/leapp/answerfile seuser=None setype=None group=root content=NOT_LOGGING_PARAMETER _original_basename=tmpckgin8fq serole=None mode=0644 selevel=None owner=root follow=False validate=None checksum=3d934ad808576e3a7fb4c14a89645a4ad55ccf53 backup=False local_follow=None
Mar 19 10:41:35 managed-node02 ansible-ansible.builtin.file[5879]: Invoked with src=None selevel=None force=False setype=None _original_basename=None unsafe_writes=False access_time=None seuser=None recurse=False state=absent access_time_format=%Y%m%d%H%M.%S group=None modification_time=None serole=None _diff_peek=None modification_time_format=%Y%m%d%H%M.%S path=/etc/leapp/files/leapp_upgrade_repositories.repo owner=None follow=True attributes=None mode=None
Mar 19 10:41:35 managed-node02 ansible-ansible.builtin.yum_repository[5940]: Invoked with metalink=None ip_resolve=None enabled=True proxy_password=NOT_LOGGING_PARAMETER mode=0644 mirrorlist_expire=None bandwidth=None cost=None file=/etc/leapp/files/leapp_upgrade_repositories owner=root exclude=None keepalive=None repo_gpgcheck=None group=root failovermethod=None unsafe_writes=False deltarpm_metadata_percentage=None gpgkey=None setype=None http_caching=None priority=None state=present mirrorlist=None params=None gpgcheck=False include=None sslcacert=None username=None metadata_expire=None description=BaseOS for x86_64 retries=None selevel=None sslclientcert=None gpgcakey=None baseurl=['http://download.devel.redhat.com/rhel-8/nightly/updates/RHEL-8/latest-RHEL-8.10/compose/BaseOS/x86_64/os/'] s3_enabled=None ssl_check_cert_permissions=None includepkgs=None async=None sslverify=None password=NOT_LOGGING_PARAMETER ui_repoid_vars=None protect=None serole=None throttle=None name=rhel-8-for-x86_64-baseos-rpms deltarpm_percentage=None sslclientkey=None seuser=None reposdir=/etc/yum.repos.d skip_if_unavailable=None module_hotfixes=None keepcache=None proxy_username=None timeout=None attributes=None metadata_expire_filter=None enablegroups=None proxy=None
Mar 19 10:41:36 managed-node02 ansible-ansible.builtin.yum_repository[6000]: Invoked with metalink=None ip_resolve=None enabled=True proxy_password=NOT_LOGGING_PARAMETER mode=0644 mirrorlist_expire=None bandwidth=None cost=None file=/etc/leapp/files/leapp_upgrade_repositories owner=root exclude=None keepalive=None repo_gpgcheck=None group=root failovermethod=None unsafe_writes=False deltarpm_metadata_percentage=None gpgkey=None setype=None http_caching=None priority=None state=present mirrorlist=None params=None gpgcheck=False include=None sslcacert=None username=None metadata_expire=None description=AppStream for x86_64 retries=None selevel=None sslclientcert=None gpgcakey=None baseurl=['http://download.devel.redhat.com/rhel-8/nightly/updates/RHEL-8/latest-RHEL-8.10/compose/AppStream/x86_64/os/'] s3_enabled=None ssl_check_cert_permissions=None includepkgs=None async=None sslverify=None password=NOT_LOGGING_PARAMETER ui_repoid_vars=None protect=None serole=None throttle=None name=rhel-8-for-x86_64-appstream-rpms deltarpm_percentage=None sslclientkey=None seuser=None reposdir=/etc/yum.repos.d skip_if_unavailable=None module_hotfixes=None keepcache=None proxy_username=None timeout=None attributes=None metadata_expire_filter=None enablegroups=None proxy=None
Mar 19 10:41:37 managed-node02 ansible-ansible.builtin.stat[6060]: Invoked with checksum_algorithm=sha1 get_checksum=True follow=False path=/var/log/leapp/leapp-report.txt get_md5=False get_mime=True get_attributes=True
Mar 19 10:41:37 managed-node02 ansible-ansible.builtin.stat[6121]: Invoked with checksum_algorithm=sha1 get_checksum=True follow=False path=/var/log/leapp/leapp-report.json get_md5=False get_mime=True get_attributes=True
Mar 19 10:41:37 managed-node02 ansible-ansible.builtin.stat[6182]: Invoked with checksum_algorithm=sha1 get_checksum=True follow=False path=/var/log/leapp/leapp-preupgrade.log get_md5=False get_mime=True get_attributes=True
Mar 19 10:41:38 managed-node02 ansible-ansible.builtin.stat[6243]: Invoked with checksum_algorithm=sha1 get_checksum=True follow=False path=/var/log/leapp/ansible_leapp_analysis.log get_md5=False get_mime=True get_attributes=True
Mar 19 10:41:38 managed-node02 ansible-ansible.builtin.lineinfile[6306]: Invoked with group=root insertbefore=None unsafe_writes=False selevel=None create=False seuser=None serole=None backrefs=False search_string=None state=present firstmatch=False mode=0644 insertafter=None path=/var/log/leapp/ansible_leapp_analysis.log owner=root regexp=None line=Job ended at 2026-03-19T14:41:38Z attributes=None backup=False validate=None setype=None
Mar 19 10:41:39 managed-node02 ansible-ansible.legacy.stat[6428]: Invoked with checksum_algorithm=sha1 get_checksum=True path=/var/log/leapp/ansible_leapp_analysis.log follow=True get_md5=False get_mime=True get_attributes=True
Mar 19 10:41:39 managed-node02 ansible-ansible.legacy.copy[6498]: Invoked with src=/var/log/leapp/ansible_leapp_analysis.log directory_mode=None force=True unsafe_writes=False remote_src=True dest=/var/log/leapp/ansible_leapp_analysis_2026-03-19_14-41-27.log selevel=None seuser=None setype=None group=None content=NOT_LOGGING_PARAMETER _original_basename=None serole=None mode=preserve checksum=None owner=None follow=False validate=None attributes=None backup=False local_follow=None
Mar 19 10:41:40 managed-node02 ansible-ansible.builtin.file[6559]: Invoked with src=None selevel=None force=False setype=None _original_basename=None unsafe_writes=False access_time=None seuser=None recurse=False state=absent access_time_format=%Y%m%d%H%M.%S group=None modification_time=None serole=None _diff_peek=None modification_time_format=%Y%m%d%H%M.%S path=/var/log/leapp/ansible_leapp_analysis.log owner=None follow=True attributes=None mode=None
Mar 19 10:41:40 managed-node02 ansible-ansible.legacy.command[6620]: Invoked with executable=None _uses_shell=True strip_empty_ends=True _raw_params=set -euxo pipefail rm -f /var/log/leapp/*.log rm -f /var/log/leapp/*.json rm -f /var/log/leapp/*.txt removes=None argv=None creates=None chdir=None stdin_add_newline=True stdin=None
Mar 19 10:41:40 managed-node02 sshd[6634]: Accepted publickey for root from 10.31.42.92 port 36338 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Mar 19 10:41:41 managed-node02 systemd-logind[547]: New session 9 of user root.
-- Subject: A new session 9 has been created for user root
-- Defined-By: systemd
-- Support: http://lists.freedesktop.org/mailman/listinfo/systemd-devel
-- Documentation: http://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A new session with the ID 9 has been created for the user root.
--
-- The leading process of the session is 6634.
Mar 19 10:41:41 managed-node02 systemd[1]: Started Session 9 of user root.
-- Subject: Unit session-9.scope has finished start-up
-- Defined-By: systemd
-- Support: http://lists.freedesktop.org/mailman/listinfo/systemd-devel
--
-- Unit session-9.scope has finished starting up.
--
-- The start-up result is done.
Mar 19 10:41:41 managed-node02 sshd[6634]: pam_unix(sshd:session): session opened for user root by (uid=0)
Mar 19 10:41:41 managed-node02 sshd[6634]: Received disconnect from 10.31.42.92 port 36338:11: disconnected by user
Mar 19 10:41:41 managed-node02 sshd[6634]: Disconnected from 10.31.42.92 port 36338
Mar 19 10:41:41 managed-node02 sshd[6634]: pam_unix(sshd:session): session closed for user root
Mar 19 10:41:41 managed-node02 systemd-logind[547]: Removed session 9.
-- Subject: Session 9 has been terminated
-- Defined-By: systemd
-- Support: http://lists.freedesktop.org/mailman/listinfo/systemd-devel
-- Documentation: http://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A session with the ID 9 has been terminated.
Mar 19 10:41:41 managed-node02 sshd[6646]: Accepted publickey for root from 10.31.42.92 port 36350 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Mar 19 10:41:41 managed-node02 systemd-logind[547]: New session 10 of user root.
-- Subject: A new session 10 has been created for user root
-- Defined-By: systemd
-- Support: http://lists.freedesktop.org/mailman/listinfo/systemd-devel
-- Documentation: http://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A new session with the ID 10 has been created for the user root.
--
-- The leading process of the session is 6646.
Mar 19 10:41:41 managed-node02 systemd[1]: Started Session 10 of user root.
-- Subject: Unit session-10.scope has finished start-up
-- Defined-By: systemd
-- Support: http://lists.freedesktop.org/mailman/listinfo/systemd-devel
--
-- Unit session-10.scope has finished starting up.
--
-- The start-up result is done.
Mar 19 10:41:41 managed-node02 sshd[6646]: pam_unix(sshd:session): session opened for user root by (uid=0)