[WARNING]: Collection infra.leapp does not support Ansible version 2.14.18
[WARNING]: running playbook inside collection infra.leapp
ansible-playbook [core 2.14.18]
  config file = /etc/ansible/ansible.cfg
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python3.9/site-packages/ansible
  ansible collection location = /root/.ansible/collections:/usr/share/ansible/collections
  executable location = /usr/bin/ansible-playbook
  python version = 3.9.25 (main, Mar 9 2026, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-14)] (/usr/bin/python3)
  jinja version = 3.1.2
  libyaml = True
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_remediations_8to9.yml ******************************************
1 plays in /root/.ansible/collections/ansible_collections/infra/leapp/tests/tests_remediations_8to9.yml

PLAY [Test RHEL 8 to 9 remediations] *******************************************

TASK [Gathering Facts] *********************************************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tests_remediations_8to9.yml:2
ok: [managed-node03]

TASK [Include tests_upgrade_custom playbook] ***********************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tests_remediations_8to9.yml:22
included: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/tests_upgrade_custom.yml for managed-node03

TASK [tests_upgrade_custom | Check if leapp upgrade log exists] ****************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/tests_upgrade_custom.yml:12
ok: [managed-node03] => {"changed": false, "stat": {"exists": false}}

TASK [tests_upgrade_custom | Skip test if already upgraded or not RHEL {{ rhel_base_ver }}] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/tests_upgrade_custom.yml:17
META: end_play conditional evaluated to False, continuing play
skipping: [managed-node03] => {"msg": "end_play", "skip_reason": "end_play conditional evaluated to False, continuing play"}

TASK [tests_upgrade_custom | Include common upgrade tasks] *********************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/tests_upgrade_custom.yml:27
included: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/common_upgrade_tasks.yml for managed-node03

TASK [common_upgrade_tasks | Remove leapp packages] ****************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/common_upgrade_tasks.yml:6
ok: [managed-node03] => {"changed": false, "msg": "Nothing to do", "rc": 0, "results": []}

TASK [common_upgrade_tasks | Gather setup tasks] *******************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/common_upgrade_tasks.yml:11
ok: [managed-node03 -> localhost] => {"changed": false, "examined": 4, "files": [{"atime": 1773838847.4482205, "ctime": 1773838847.2592201, "dev": 51716, "gid": 0, "gr_name": "root", "inode": 746586270, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mode": "0644", "mtime": 1773838847.2592201, "nlink": 1, "path": "/root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/setup/remediate_cifs.yml", "pw_name": "root", "rgrp": true, "roth": true, "rusr": true, "size": 272, "uid": 0, "wgrp": false, "woth": false, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}, {"atime": 1773838847.4482205, "ctime": 1773838847.2592201, "dev": 51716, "gid": 0, "gr_name": "root", "inode": 746586271, "isblk": false, "ischr": false, "isdir":
false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mode": "0644", "mtime": 1773838847.2592201, "nlink": 1, "path": "/root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/setup/remediate_remote_using_root.yml", "pw_name": "root", "rgrp": true, "roth": true, "rusr": true, "size": 268, "uid": 0, "wgrp": false, "woth": false, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}, {"atime": 1773838847.4482205, "ctime": 1773838847.2592201, "dev": 51716, "gid": 0, "gr_name": "root", "inode": 746586272, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mode": "0644", "mtime": 1773838847.2592201, "nlink": 1, "path": "/root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/setup/remediate_removed_kernel_drivers.yml", "pw_name": "root", "rgrp": true, "roth": true, "rusr": true, "size": 913, "uid": 0, "wgrp": false, "woth": false, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}, {"atime": 1773838847.4482205, "ctime": 1773838847.2592201, "dev": 51716, "gid": 0, "gr_name": "root", "inode": 746586273, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mode": "0644", "mtime": 1773838847.2592201, "nlink": 1, "path": "/root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/setup/version_lock.yml", "pw_name": "root", "rgrp": true, "roth": true, "rusr": true, "size": 548, "uid": 0, "wgrp": false, "woth": false, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}], "matched": 4, "msg": "All paths examined", "skipped_paths": {}}

TASK [common_upgrade_tasks | Do remediation setup tasks] ***********************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/common_upgrade_tasks.yml:20
included: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/setup/remediate_cifs.yml for managed-node03 => (item=/root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/setup/remediate_cifs.yml)
included: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/setup/remediate_remote_using_root.yml for managed-node03 => (item=/root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/setup/remediate_remote_using_root.yml)
included: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/setup/remediate_removed_kernel_drivers.yml for managed-node03 => (item=/root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/setup/remediate_removed_kernel_drivers.yml)

TASK [setup | remediate_cifs | Add a CIFS share to /etc/fstab] *****************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/setup/remediate_cifs.yml:3
changed: [managed-node03] => {"backup": "", "changed": true, "msg": "line added"}

TASK [setup | remediate_remote_using_root | Set the parameter to not remediate SSH password authentication] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/setup/remediate_remote_using_root.yml:3
ok: [managed-node03] => {"ansible_facts": {"leapp_remediate_ssh_password_auth": false}, "changed": false}

TASK [setup | remediate_removed_kernel_drivers | Set list of test kernel modules] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/setup/remediate_removed_kernel_drivers.yml:4
ok: [managed-node03] => {"ansible_facts": {"leapp_test_kernel_modules": ["dnet", "dlci", "liquidio"]}, "changed": false}

TASK [setup | remediate_removed_kernel_drivers | Load the test kernel modules] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/setup/remediate_removed_kernel_drivers.yml:21

TASK [infra.leapp.common : manage_kernel_modules | Load or unload kernel modules] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/manage_kernel_modules.yml:5
included: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/manage_one_kernel_module.yml for managed-node03 => (item=dnet)
included: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/manage_one_kernel_module.yml for managed-node03 => (item=dlci)
included: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/manage_one_kernel_module.yml for managed-node03 => (item=liquidio)

TASK [infra.leapp.common : manage_one_kernel_module | Load or unload kernel module] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/manage_one_kernel_module.yml:5
changed: [managed-node03] => {"changed": true, "name": "dnet", "params": "", "state": "present"}

TASK [infra.leapp.common : manage_one_kernel_module | Disable modules-load.d file entry] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/manage_one_kernel_module.yml:17
skipping: [managed-node03] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : manage_one_kernel_module | Ensure modules are not loaded at boot] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/manage_one_kernel_module.yml:25
skipping: [managed-node03] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : manage_one_kernel_module | Debug modprobe.d file] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/manage_one_kernel_module.yml:33
skipping: [managed-node03] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : manage_one_kernel_module | Debug modules-load.d file] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/manage_one_kernel_module.yml:37
skipping: [managed-node03] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : manage_one_kernel_module | Load or unload kernel module] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/manage_one_kernel_module.yml:5
changed: [managed-node03] => {"changed": true, "name": "dlci", "params": "", "state": "present"}

TASK [infra.leapp.common : manage_one_kernel_module | Disable modules-load.d file entry] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/manage_one_kernel_module.yml:17
skipping: [managed-node03] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : manage_one_kernel_module | Ensure modules are not loaded at boot] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/manage_one_kernel_module.yml:25
skipping: [managed-node03] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : manage_one_kernel_module | Debug modprobe.d file] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/manage_one_kernel_module.yml:33
skipping: [managed-node03] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : manage_one_kernel_module | Debug modules-load.d file] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/manage_one_kernel_module.yml:37
skipping: [managed-node03] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : manage_one_kernel_module | Load or unload kernel module] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/manage_one_kernel_module.yml:5
changed: [managed-node03] => {"changed": true, "name": "liquidio", "params": "", "state": "present"}

TASK [infra.leapp.common : manage_one_kernel_module | Disable modules-load.d file entry] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/manage_one_kernel_module.yml:17
skipping: [managed-node03] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : manage_one_kernel_module | Ensure modules are not loaded at boot] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/manage_one_kernel_module.yml:25
skipping: [managed-node03] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : manage_one_kernel_module | Debug modprobe.d file] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/manage_one_kernel_module.yml:33
skipping: [managed-node03] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : manage_one_kernel_module | Debug modules-load.d file] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/manage_one_kernel_module.yml:37
skipping: [managed-node03] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [common_upgrade_tasks | Do setup tasks] ***********************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/common_upgrade_tasks.yml:31
skipping: [managed-node03] => (item=/root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/setup/version_lock.yml) => {"ansible_loop_var": "setup_task_file", "changed": false, "setup_task_file": "/root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/setup/version_lock.yml", "skip_reason": "Conditional result was False"}
skipping: [managed-node03] => {"changed": false, "msg": "All items skipped"}

TASK [common_upgrade_tasks | Run first analysis] *******************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/common_upgrade_tasks.yml:42

TASK [infra.leapp.analysis : Lock timestamped variables] ***********************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/main.yml:5
ok: [managed-node03] => {"ansible_facts": {"__leapp_timestamp": "2026-03-18_13-06-35"}, "changed": false}

TASK [Initialize lock, logging, and common vars] *******************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/main.yml:9

TASK [infra.leapp.common : init_leapp_log | Ensure that log directory exists] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/init_leapp_log.yml:9
changed: [managed-node03] => {"changed": true, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/var/log/leapp", "secontext": "unconfined_u:object_r:var_log_t:s0", "size": 6, "state": "directory", "uid": 0}

TASK [infra.leapp.common : init_leapp_log | Check for existing log file] *******
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/init_leapp_log.yml:17
ok: [managed-node03] => {"changed": false, "stat": {"exists": false}}

TASK [infra.leapp.common : init_leapp_log | Fail if log file already exists] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/init_leapp_log.yml:22
skipping: [managed-node03] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : init_leapp_log | Create new log file] ***************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/init_leapp_log.yml:31
changed: [managed-node03] => {"changed": true, "checksum": "8968b09195716bd817b7dec596693ebe6ed1d959", "dest": "/var/log/leapp/ansible_leapp_analysis.log", "gid": 0, "group": "root", "md5sum": "2f337981ababd3cf115e09af9fa4d566", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:var_log_t:s0", "size": 70, "src":
"/root/.ansible/tmp/ansible-tmp-1773839196.5226629-9756-169623907402954/source", "state": "file", "uid": 0}

TASK [infra.leapp.common : init_leapp_log | /etc/ansible/facts.d directory exists] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/init_leapp_log.yml:41
changed: [managed-node03] => {"changed": true, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/etc/ansible/facts.d", "secontext": "unconfined_u:object_r:etc_t:s0", "size": 6, "state": "directory", "uid": 0}

TASK [infra.leapp.common : init_leapp_log | Capture current ansible_facts for validation after upgrade] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/init_leapp_log.yml:49
changed: [managed-node03] => (item=/etc/ansible/facts.d/pre_ipu.fact) => {"ansible_loop_var": "item", "changed": true, "checksum": "044e7b8239320cf71acb917fcc4ea0c6d22fa2aa", "dest": "/etc/ansible/facts.d/pre_ipu.fact", "gid": 0, "group": "root", "item": "/etc/ansible/facts.d/pre_ipu.fact", "md5sum": "fa18f428fa28612a1051e4dab20dc921", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:etc_t:s0", "size": 13839, "src": "/root/.ansible/tmp/ansible-tmp-1773839197.6892102-9809-70308775957701/source", "state": "file", "uid": 0}
changed: [managed-node03] => (item=/var/log/leapp/ansible_leapp_analysis.log) => {"ansible_loop_var": "item", "changed": true, "checksum": "044e7b8239320cf71acb917fcc4ea0c6d22fa2aa", "dest": "/var/log/leapp/ansible_leapp_analysis.log", "gid": 0, "group": "root", "item": "/var/log/leapp/ansible_leapp_analysis.log", "md5sum": "fa18f428fa28612a1051e4dab20dc921", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:var_log_t:s0", "size": 13839, "src": "/root/.ansible/tmp/ansible-tmp-1773839198.3519363-9809-182013657867994/source", "state": "file", "uid": 0}

TASK [infra.leapp.common : init_leapp_log | Capture a list of non-rhel versioned packages] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/init_leapp_log.yml:60
ok: [managed-node03] => {"changed": false, "cmd": "set -o pipefail; export PATH=$PATH; rpm -qa | grep -ve '[\\.|+]el8' | grep -vE '^(gpg-pubkey|libmodulemd|katello-ca-consumer)' | sort", "delta": "0:00:00.857330", "end": "2026-03-18 09:06:40.332845", "failed_when_result": false, "msg": "non-zero return code", "rc": 1, "start": "2026-03-18 09:06:39.475515", "stderr": "", "stderr_lines": [], "stdout": "", "stdout_lines": []}

TASK [infra.leapp.common : init_leapp_log | Create fact with the non-rhel versioned packages list] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/init_leapp_log.yml:74
ok: [managed-node03] => {"ansible_facts": {"non_rhel_packages": []}, "changed": false}

TASK [infra.leapp.common : init_leapp_log | Capture the list of non-rhel versioned packages in a separate fact file] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/init_leapp_log.yml:78
changed: [managed-node03] => (item=/etc/ansible/facts.d/non_rhel_packages.fact) => {"ansible_loop_var": "item", "changed": true, "checksum": "97d170e1550eee4afc0af065b78cda302a97674c", "dest": "/etc/ansible/facts.d/non_rhel_packages.fact", "gid": 0, "group": "root", "item": "/etc/ansible/facts.d/non_rhel_packages.fact", "md5sum": "d751713988987e9331980363e24189ce", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:etc_t:s0", "size": 2, "src": "/root/.ansible/tmp/ansible-tmp-1773839200.466031-9928-98401967209276/source", "state": "file", "uid": 0}
changed: [managed-node03] => (item=/var/log/leapp/ansible_leapp_analysis.log) => {"ansible_loop_var": "item", "changed": true, "checksum": "97d170e1550eee4afc0af065b78cda302a97674c", "dest": "/var/log/leapp/ansible_leapp_analysis.log", "gid": 0, "group": "root", "item": "/var/log/leapp/ansible_leapp_analysis.log", "md5sum": "d751713988987e9331980363e24189ce", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:var_log_t:s0", "size": 2, "src": "/root/.ansible/tmp/ansible-tmp-1773839201.1323354-9928-221266378357809/source", "state": "file", "uid": 0}

TASK [infra.leapp.analysis : Include tasks for preupg assistant analysis] ******
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/main.yml:19
skipping: [managed-node03] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.analysis : Include tasks for leapp preupgrade analysis] ******
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/main.yml:23
included: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml for managed-node03

TASK [analysis-leapp | Include pre_upgrade.yml] ********************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml:4

TASK [infra.leapp.common : pre_upgrade | Register with Satellite activation key] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/pre_upgrade.yml:3
skipping: [managed-node03] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [pre_upgrade | Include custom_local_repos for local_repos_pre_leapp] ******
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/pre_upgrade.yml:10

TASK [infra.leapp.common : custom_local_repos | Remove old /etc/leapp/files/leapp_upgrade_repositories.repo] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/custom_local_repos.yml:2
skipping: [managed-node03] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : custom_local_repos | Validate repo definitions have baseurl or metalink] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/custom_local_repos.yml:9
skipping: [managed-node03] => {"changed": false, "skipped_reason": "No items in the list"}

TASK [infra.leapp.common : custom_local_repos | Enable custom upgrade yum repositories] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/custom_local_repos.yml:16
skipping: [managed-node03] => {"changed": false, "skipped_reason": "No items in the list"}

TASK [infra.leapp.common : pre_upgrade | Get package version lock entries] *****
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/pre_upgrade.yml:21
ok: [managed-node03] => {"changed": false, "cmd": ["dnf", "versionlock", "list"], "delta": "0:00:00.353499", "end": "2026-03-18 09:06:42.821606", "failed_when_result": false, "msg": "non-zero return code", "rc": 1, "start": "2026-03-18 09:06:42.468107", "stderr": "No such command: versionlock. Please use /usr/bin/dnf --help\nIt could be a DNF plugin command, try: \"dnf install 'dnf-command(versionlock)'\"", "stderr_lines": ["No such command: versionlock.
Please use /usr/bin/dnf --help", "It could be a DNF plugin command, try: \"dnf install 'dnf-command(versionlock)'\""], "stdout": "", "stdout_lines": []}

TASK [infra.leapp.common : pre_upgrade | Remove all package version locks] *****
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/pre_upgrade.yml:28
skipping: [managed-node03] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : pre_upgrade | Install packages for upgrade from RHEL 7] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/pre_upgrade.yml:36
changed: [managed-node03] => {"changed": true, "msg": "", "rc": 0, "results": ["Installed: policycoreutils-python-utils-2.9-26.el8_10.noarch", "Installed: leapp-0.20.0-1.el8_10.noarch", "Installed: leapp-deps-0.20.0-1.el8_10.noarch", "Installed: leapp-upgrade-el8toel9-0.23.0-1.el8_10.noarch", "Installed: leapp-upgrade-el8toel9-deps-0.23.0-1.el8_10.noarch", "Installed: systemd-container-239-82.el8_10.15.x86_64", "Installed: python3-leapp-0.20.0-1.el8_10.noarch"]}

TASK [infra.leapp.common : pre_upgrade | Include update-and-reboot.yml] ********
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/pre_upgrade.yml:45
included: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/update-and-reboot.yml for managed-node03

TASK [infra.leapp.common : update-and-reboot | Ensure all updates are applied] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/update-and-reboot.yml:2
ASYNC OK on managed-node03: jid=j836737144146.7600
changed: [managed-node03] => {"ansible_job_id": "j836737144146.7600", "changed": true, "finished": 1, "msg": "", "rc": 0, "results": ["Installed: libatasmart-0.19-14.el8.x86_64", "Installed: libblockdev-2.28-7.el8_10.x86_64", "Installed: libblockdev-crypto-2.28-7.el8_10.x86_64", "Installed: libsmbios-2.4.1-2.el8.x86_64", "Installed: libblockdev-fs-2.28-7.el8_10.x86_64", "Installed: bubblewrap-0.4.0-2.el8_10.x86_64", "Installed: libblockdev-loop-2.28-7.el8_10.x86_64", "Installed: glibc-headers-2.28-251.el8_10.31.x86_64", "Installed: libblockdev-mdraid-2.28-7.el8_10.x86_64", "Installed: udisks2-2.9.0-16.el8_10.1.x86_64", "Installed: libblockdev-part-2.28-7.el8_10.x86_64", "Installed: libblockdev-swap-2.28-7.el8_10.x86_64", "Installed: libblockdev-utils-2.28-7.el8_10.x86_64", "Installed: glibc-langpack-en-2.28-251.el8_10.31.x86_64", "Installed: libudisks2-2.9.0-16.el8_10.1.x86_64", "Installed: libbytesize-1.4-3.el8.x86_64", "Installed: mdadm-4.2-19.el8_10.x86_64", "Installed: libnfsidmap-1:2.3.3-68.el8_10.x86_64", "Installed: fwupd-1.7.8-2.el8.x86_64", "Installed: libpng-2:1.6.34-10.el8_10.x86_64", "Installed: libgcab1-1.1-1.el8.x86_64", "Installed: grub2-common-1:2.02-170.el8_10.1.noarch", "Installed: nfs-utils-1:2.3.3-68.el8_10.x86_64", "Installed: grub2-efi-x64-1:2.02-170.el8_10.1.x86_64", "Installed: grub2-efi-x64-modules-1:2.02-170.el8_10.1.noarch", "Installed: grub2-pc-1:2.02-170.el8_10.1.x86_64", "Installed: grub2-pc-modules-1:2.02-170.el8_10.1.noarch", "Installed: libgudev-232-4.el8.x86_64", "Installed: grub2-tools-efi-1:2.02-170.el8_10.1.x86_64", "Installed: libgusb-0.3.0-1.el8.x86_64", "Installed: grub2-tools-1:2.02-170.el8_10.1.x86_64", "Installed: grub2-tools-extra-1:2.02-170.el8_10.1.x86_64", "Installed: grub2-tools-minimal-1:2.02-170.el8_10.1.x86_64", "Installed: libxmlb-0.1.15-1.el8.x86_64", "Installed: glibc-2.28-251.el8_10.31.x86_64", "Installed: glibc-common-2.28-251.el8_10.31.x86_64", "Installed: glibc-devel-2.28-251.el8_10.31.x86_64", "Installed: volume_key-libs-0.3.11-6.el8.x86_64", "Installed: glibc-gconv-extra-2.28-251.el8_10.31.x86_64", "Installed: dosfstools-4.1-6.el8.x86_64", "Removed: grub2-common-1:2.02-169.el8_10.noarch", "Removed: grub2-efi-x64-1:2.02-169.el8_10.x86_64", "Removed: grub2-efi-x64-modules-1:2.02-169.el8_10.noarch", "Removed:
grub2-pc-1:2.02-169.el8_10.x86_64", "Removed: grub2-pc-modules-1:2.02-169.el8_10.noarch", "Removed: grub2-tools-1:2.02-169.el8_10.x86_64", "Removed: grub2-tools-extra-1:2.02-169.el8_10.x86_64", "Removed: grub2-tools-minimal-1:2.02-169.el8_10.x86_64", "Removed: libnfsidmap-1:2.3.3-66.el8_10.x86_64", "Removed: glibc-2.28-251.el8_10.27.x86_64", "Removed: glibc-common-2.28-251.el8_10.27.x86_64", "Removed: glibc-devel-2.28-251.el8_10.27.x86_64", "Removed: glibc-gconv-extra-2.28-251.el8_10.27.x86_64", "Removed: glibc-headers-2.28-251.el8_10.27.x86_64", "Removed: glibc-langpack-en-2.28-251.el8_10.27.x86_64", "Removed: libpng-2:1.6.34-9.el8_10.x86_64", "Removed: dbxtool-8-5.el8_3.2.x86_64", "Removed: nfs-utils-1:2.3.3-66.el8_10.x86_64"], "results_file": "/root/.ansible_async/j836737144146.7600", "started": 1, "stderr": "", "stderr_lines": [], "stdout": "", "stdout_lines": []}

TASK [infra.leapp.common : update-and-reboot | Reboot when updates applied] ****
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/update-and-reboot.yml:10
fatal: [managed-node03]: FAILED! => {"msg": "The field 'timeout' has an invalid value, which includes an undefined variable. The error was: 'leapp_reboot_timeout' is undefined. 'leapp_reboot_timeout' is undefined\n\nThe error appears to be in '/root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/update-and-reboot.yml': line 10, column 3, but may\nbe elsewhere in the file depending on the exact syntax problem.\n\nThe offending line appears to be:\n\n\n- name: update-and-reboot | Reboot when updates applied\n ^ here\n"}

TASK [analysis-leapp | Include custom_local_repos for local_repos_post_analysis] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml:70

TASK [infra.leapp.common : custom_local_repos | Remove old /etc/leapp/files/leapp_upgrade_repositories.repo] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/custom_local_repos.yml:2
skipping: [managed-node03] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : custom_local_repos | Validate repo definitions have baseurl or metalink] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/custom_local_repos.yml:9
skipping: [managed-node03] => {"changed": false, "skipped_reason": "No items in the list"}

TASK [infra.leapp.common : custom_local_repos | Enable custom upgrade yum repositories] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/custom_local_repos.yml:16
skipping: [managed-node03] => {"changed": false, "skipped_reason": "No items in the list"}

TASK [analysis-leapp | Restore original Satellite activation key] **************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml:80
skipping: [managed-node03] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [analysis-leapp | Copy reports to the controller] *************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml:91

TASK [infra.leapp.common : copy_reports_to_controller | Ensure reports directory on controller] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_reports_to_controller.yml:20
changed: [managed-node03 -> localhost] => {"changed": true, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/root/.ansible/collections/ansible_collections/infra/leapp/tests/ansible_leapp_analysis_logs_2026-03-18_13-06-35", "secontext": "unconfined_u:object_r:admin_home_t:s0", "size": 6, "state": "directory", "uid": 0}

TASK [infra.leapp.common : copy_reports_to_controller | Fetch report files if they exist] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_reports_to_controller.yml:30
included: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/fetch_file_if_exists.yml for managed-node03 => (item=/var/log/leapp/leapp-report.txt)
included: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/fetch_file_if_exists.yml for managed-node03 => (item=/var/log/leapp/leapp-report.json)
included: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/fetch_file_if_exists.yml for managed-node03 => (item=/var/log/leapp/leapp-preupgrade.log)

TASK [infra.leapp.common : fetch_file_if_exists | Check if file exists] ********
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/fetch_file_if_exists.yml:7
ok: [managed-node03] => {"changed": false, "stat": {"exists": false}}

TASK [infra.leapp.common : fetch_file_if_exists | Copy report file to the controller] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/fetch_file_if_exists.yml:12
skipping: [managed-node03] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : fetch_file_if_exists | Check if file exists] ********
task path:
/root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/fetch_file_if_exists.yml:7 ok: [managed-node03] => {"changed": false, "stat": {"exists": false}} TASK [infra.leapp.common : fetch_file_if_exists | Copy report file to the controller] *** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/fetch_file_if_exists.yml:12 skipping: [managed-node03] => {"changed": false, "skip_reason": "Conditional result was False"} TASK [infra.leapp.common : fetch_file_if_exists | Check if file exists] ******** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/fetch_file_if_exists.yml:7 ok: [managed-node03] => {"changed": false, "stat": {"exists": false}} TASK [infra.leapp.common : fetch_file_if_exists | Copy report file to the controller] *** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/fetch_file_if_exists.yml:12 skipping: [managed-node03] => {"changed": false, "skip_reason": "Conditional result was False"} TASK [copy_reports_to_controller | Copy log file to the controller] ************ task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_reports_to_controller.yml:39 TASK [infra.leapp.common : copy_archive_leapp_log | Check for log file] ******** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_archive_leapp_log.yml:11 ok: [managed-node03] => {"changed": false, "stat": {"atime": 1773839201.790571, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "97d170e1550eee4afc0af065b78cda302a97674c", "ctime": 1773839201.791571, "dev": 51715, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 813695113, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", 
"mode": "0644", "mtime": 1773839201.4765701, "nlink": 1, "path": "/var/log/leapp/ansible_leapp_analysis.log", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 2, "uid": 0, "version": "3689487098", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}} TASK [infra.leapp.common : copy_archive_leapp_log | Add end time to log file] *** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_archive_leapp_log.yml:19 changed: [managed-node03] => {"backup": "", "changed": true, "msg": "line added"} TASK [infra.leapp.common : copy_archive_leapp_log | Slurp file /var/log/leapp/ansible_leapp_analysis.log] *** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_archive_leapp_log.yml:27 ok: [managed-node03] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false} TASK [infra.leapp.common : copy_archive_leapp_log | Decode file /var/log/leapp/ansible_leapp_analysis.log] *** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_archive_leapp_log.yml:33 ok: [managed-node03] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false} TASK [infra.leapp.common : copy_archive_leapp_log | Ensure reports directory on controller] *** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_archive_leapp_log.yml:38 ok: [managed-node03 -> localhost] => {"changed": false, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/root/.ansible/collections/ansible_collections/infra/leapp/tests/ansible_leapp_analysis_logs_2026-03-18_13-06-35", "secontext": "unconfined_u:object_r:admin_home_t:s0", "size": 6, "state": "directory", "uid": 0} TASK [infra.leapp.common : copy_archive_leapp_log | Copy 
ansible leapp log to the controller] *** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_archive_leapp_log.yml:48 changed: [managed-node03] => {"changed": true, "checksum": "fddf59e21f3b79922498380fbd01e51178908107", "dest": "/root/.ansible/collections/ansible_collections/infra/leapp/tests/ansible_leapp_analysis_logs_2026-03-18_13-06-35/managed-node03/ansible_leapp_analysis.log", "md5sum": "b18d1de71ca773cc083d456039144559", "remote_checksum": "fddf59e21f3b79922498380fbd01e51178908107", "remote_md5sum": null} TASK [infra.leapp.common : copy_archive_leapp_log | Copy log file to timestamped location] *** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_archive_leapp_log.yml:54 changed: [managed-node03] => {"changed": true, "checksum": "fddf59e21f3b79922498380fbd01e51178908107", "dest": "/var/log/leapp/ansible_leapp_analysis_2026-03-18_13-06-35.log", "gid": 0, "group": "root", "md5sum": "b18d1de71ca773cc083d456039144559", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:var_log_t:s0", "size": 37, "src": "/var/log/leapp/ansible_leapp_analysis.log", "state": "file", "uid": 0} TASK [infra.leapp.common : copy_archive_leapp_log | Remove original log file] *** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_archive_leapp_log.yml:66 changed: [managed-node03] => {"changed": true, "path": "/var/log/leapp/ansible_leapp_analysis.log", "state": "absent"} TASK [tests_upgrade_custom | Include cleanup logs] ***************************** task path: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/tests_upgrade_custom.yml:49 TASK [infra.leapp.common : cleanup_logs | Cleanup | Remove log files] ********** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/cleanup_logs.yml:2 changed: [managed-node03] => {"changed": true, "cmd": "set -euxo pipefail\nrm -f 
/var/log/leapp/*.log\nrm -f /var/log/leapp/*.json\nrm -f /var/log/leapp/*.txt\n", "delta": "0:00:00.005538", "end": "2026-03-18 09:07:55.188398", "msg": "", "rc": 0, "start": "2026-03-18 09:07:55.182860", "stderr": "+ rm -f /var/log/leapp/ansible_leapp_analysis_2026-03-18_13-06-35.log\n+ rm -f '/var/log/leapp/*.json'\n+ rm -f '/var/log/leapp/*.txt'", "stderr_lines": ["+ rm -f /var/log/leapp/ansible_leapp_analysis_2026-03-18_13-06-35.log", "+ rm -f '/var/log/leapp/*.json'", "+ rm -f '/var/log/leapp/*.txt'"], "stdout": "", "stdout_lines": []} PLAY RECAP ********************************************************************* managed-node03 : ok=48 changed=17 unreachable=0 failed=1 skipped=27 rescued=0 ignored=0 -- Logs begin at Wed 2026-03-18 08:57:57 EDT, end at Wed 2026-03-18 09:07:55 EDT. -- Mar 18 09:06:23 managed-node03 sshd[4627]: Accepted publickey for root from 10.31.40.234 port 41912 ssh2: ECDSA SHA256:Vge93M0aHCBgo1IUfcKx6Yq8LKsqsMC5D+QXx8ms+30 Mar 18 09:06:23 managed-node03 systemd-logind[620]: New session 8 of user root. -- Subject: A new session 8 has been created for user root -- Defined-By: systemd -- Support: https://access.redhat.com/support -- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat -- -- A new session with the ID 8 has been created for the user root. -- -- The leading process of the session is 4627. Mar 18 09:06:23 managed-node03 systemd[1]: Started Session 8 of user root. -- Subject: Unit session-8.scope has finished start-up -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit session-8.scope has finished starting up. -- -- The start-up result is done. 
Mar 18 09:06:23 managed-node03 sshd[4627]: pam_unix(sshd:session): session opened for user root by (uid=0) Mar 18 09:06:23 managed-node03 platform-python[4751]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Mar 18 09:06:25 managed-node03 platform-python[4885]: ansible-ansible.builtin.stat Invoked with path=/var/log/leapp/leapp-upgrade.log follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Mar 18 09:06:25 managed-node03 platform-python[4990]: ansible-ansible.legacy.dnf Invoked with name=['leapp-upgrade'] state=absent allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None Mar 18 09:06:32 managed-node03 platform-python[5104]: ansible-ansible.builtin.lineinfile Invoked with path=/etc/fstab line=//127.0.0.1/test_remediate_cifs /mnt/cifs cifs username=test,password=test 0 0 state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Mar 18 09:06:33 managed-node03 platform-python[5209]: ansible-infra.leapp.modprobe Invoked with name=dnet state=present params= persistent=disabled Mar 18 09:06:33 managed-node03 platform-python[5319]: ansible-infra.leapp.modprobe Invoked with name=dlci state=present params= persistent=disabled Mar 18 09:06:33 managed-node03 kernel: DLCI driver v0.35, 4 Jan 1997, mike.mclagan@linux.org. 
Mar 18 09:06:34 managed-node03 platform-python[5429]: ansible-infra.leapp.modprobe Invoked with name=liquidio state=present params= persistent=disabled Mar 18 09:06:36 managed-node03 platform-python[5539]: ansible-ansible.builtin.file Invoked with path=/var/log/leapp state=directory owner=root group=root mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Mar 18 09:06:36 managed-node03 platform-python[5644]: ansible-ansible.builtin.stat Invoked with path=/var/log/leapp/ansible_leapp_analysis.log follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Mar 18 09:06:36 managed-node03 platform-python[5749]: ansible-ansible.legacy.stat Invoked with path=/var/log/leapp/ansible_leapp_analysis.log follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Mar 18 09:06:37 managed-node03 platform-python[5833]: ansible-ansible.legacy.copy Invoked with dest=/var/log/leapp/ansible_leapp_analysis.log owner=root group=root mode=0644 src=/root/.ansible/tmp/ansible-tmp-1773839196.5226629-9756-169623907402954/source _original_basename=tmp51lj721g follow=False checksum=8968b09195716bd817b7dec596693ebe6ed1d959 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Mar 18 09:06:37 managed-node03 platform-python[5940]: ansible-ansible.builtin.file Invoked with path=/etc/ansible/facts.d state=directory mode=0755 owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None 
modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Mar 18 09:06:37 managed-node03 platform-python[6045]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/pre_ipu.fact follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Mar 18 09:06:38 managed-node03 platform-python[6129]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/pre_ipu.fact mode=0644 owner=root group=root src=/root/.ansible/tmp/ansible-tmp-1773839197.6892102-9809-70308775957701/source _original_basename=tmpq4ajuv0t follow=False checksum=044e7b8239320cf71acb917fcc4ea0c6d22fa2aa backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Mar 18 09:06:38 managed-node03 platform-python[6236]: ansible-ansible.legacy.stat Invoked with path=/var/log/leapp/ansible_leapp_analysis.log follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Mar 18 09:06:38 managed-node03 platform-python[6322]: ansible-ansible.legacy.copy Invoked with dest=/var/log/leapp/ansible_leapp_analysis.log mode=0644 owner=root group=root src=/root/.ansible/tmp/ansible-tmp-1773839198.3519363-9809-182013657867994/source _original_basename=tmp1hcttfib follow=False checksum=044e7b8239320cf71acb917fcc4ea0c6d22fa2aa backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Mar 18 09:06:39 managed-node03 platform-python[6429]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; export PATH=$PATH; rpm -qa | grep -ve '[\.|+]el8' | grep -vE '^(gpg-pubkey|libmodulemd|katello-ca-consumer)' | sort _uses_shell=True stdin_add_newline=True strip_empty_ends=True 
argv=None chdir=None executable=None creates=None removes=None stdin=None Mar 18 09:06:40 managed-node03 platform-python[6539]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/non_rhel_packages.fact follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Mar 18 09:06:41 managed-node03 platform-python[6623]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/non_rhel_packages.fact mode=0644 owner=root group=root src=/root/.ansible/tmp/ansible-tmp-1773839200.466031-9928-98401967209276/source _original_basename=tmpugdhfsf3 follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Mar 18 09:06:41 managed-node03 platform-python[6730]: ansible-ansible.legacy.stat Invoked with path=/var/log/leapp/ansible_leapp_analysis.log follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Mar 18 09:06:41 managed-node03 platform-python[6816]: ansible-ansible.legacy.copy Invoked with dest=/var/log/leapp/ansible_leapp_analysis.log mode=0644 owner=root group=root src=/root/.ansible/tmp/ansible-tmp-1773839201.1323354-9928-221266378357809/source _original_basename=tmp5bt35xk2 follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Mar 18 09:06:42 managed-node03 platform-python[6923]: ansible-ansible.legacy.command Invoked with _raw_params=dnf versionlock list _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Mar 18 09:06:43 managed-node03 
platform-python[7030]: ansible-ansible.legacy.dnf Invoked with name=['leapp-upgrade'] state=latest allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None Mar 18 09:06:46 managed-node03 dbus-daemon[622]: [system] Reloaded configuration Mar 18 09:06:46 managed-node03 dbus-daemon[622]: [system] Reloaded configuration Mar 18 09:06:46 managed-node03 dbus-daemon[622]: [system] Reloaded configuration Mar 18 09:06:46 managed-node03 dbus-daemon[622]: [system] Reloaded configuration Mar 18 09:06:46 managed-node03 dbus-daemon[622]: [system] Reloaded configuration Mar 18 09:06:46 managed-node03 dbus-daemon[622]: [system] Reloaded configuration Mar 18 09:06:46 managed-node03 dbus-daemon[622]: [system] Reloaded configuration Mar 18 09:06:47 managed-node03 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. -- Subject: Unit run-ra32a438f81a34e2ebfd691edf607b67e.service has finished start-up -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit run-ra32a438f81a34e2ebfd691edf607b67e.service has finished starting up. -- -- The start-up result is done. Mar 18 09:06:47 managed-node03 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Mar 18 09:06:47 managed-node03 systemd[1]: Starting man-db-cache-update.service... -- Subject: Unit man-db-cache-update.service has begun start-up -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit man-db-cache-update.service has begun starting up. 
Mar 18 09:06:47 managed-node03 systemd[1]: Reloading. Mar 18 09:06:49 managed-node03 ansible-async_wrapper.py[7600]: Invoked with j836737144146 7200 /root/.ansible/tmp/ansible-tmp-1773839208.700524-10205-21460990383720/AnsiballZ_dnf.py _ Mar 18 09:06:49 managed-node03 ansible-async_wrapper.py[7607]: Starting module and watcher Mar 18 09:06:49 managed-node03 ansible-async_wrapper.py[7607]: Start watching 7608 (7200) Mar 18 09:06:49 managed-node03 ansible-async_wrapper.py[7608]: Start module (7608) Mar 18 09:06:49 managed-node03 ansible-async_wrapper.py[7600]: Return async_wrapper task started. Mar 18 09:06:49 managed-node03 systemd[1]: man-db-cache-update.service: Succeeded. -- Subject: Unit succeeded -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- The unit man-db-cache-update.service has successfully entered the 'dead' state. Mar 18 09:06:49 managed-node03 systemd[1]: Started man-db-cache-update.service. -- Subject: Unit man-db-cache-update.service has finished start-up -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit man-db-cache-update.service has finished starting up. -- -- The start-up result is done. Mar 18 09:06:49 managed-node03 systemd[1]: run-ra32a438f81a34e2ebfd691edf607b67e.service: Succeeded. -- Subject: Unit succeeded -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- The unit run-ra32a438f81a34e2ebfd691edf607b67e.service has successfully entered the 'dead' state. 
Mar 18 09:06:49 managed-node03 platform-python[7609]: ansible-ansible.legacy.dnf Invoked with name=['*'] state=latest allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None Mar 18 09:06:54 managed-node03 ansible-async_wrapper.py[7607]: 7608 still running (7200) Mar 18 09:06:58 managed-node03 systemd[1]: Stopping Command Scheduler... -- Subject: Unit crond.service has begun shutting down -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit crond.service has begun shutting down. Mar 18 09:06:58 managed-node03 crond[1378]: (CRON) INFO (Shutting down) Mar 18 09:06:58 managed-node03 systemd[1]: crond.service: Succeeded. -- Subject: Unit succeeded -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- The unit crond.service has successfully entered the 'dead' state. Mar 18 09:06:58 managed-node03 systemd[1]: Stopped Command Scheduler. -- Subject: Unit crond.service has finished shutting down -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit crond.service has finished shutting down. Mar 18 09:06:58 managed-node03 systemd[1]: crond.service: Found left-over process 4579 (anacron) in control group while starting unit. Ignoring. Mar 18 09:06:58 managed-node03 systemd[1]: This usually indicates unclean termination of a previous run, or service implementation deficiencies. Mar 18 09:06:58 managed-node03 systemd[1]: Started Command Scheduler. 
-- Subject: Unit crond.service has finished start-up -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit crond.service has finished starting up. -- -- The start-up result is done. Mar 18 09:06:58 managed-node03 crond[7713]: (CRON) STARTUP (1.5.2) Mar 18 09:06:58 managed-node03 crond[7713]: (CRON) INFO (Syslog will be used instead of sendmail.) Mar 18 09:06:58 managed-node03 crond[7713]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 5% if used.) Mar 18 09:06:58 managed-node03 crond[7713]: (CRON) INFO (running with inotify support) Mar 18 09:06:58 managed-node03 crond[7713]: (CRON) INFO (@reboot jobs will be run at computer's startup.) Mar 18 09:06:59 managed-node03 ansible-async_wrapper.py[7607]: 7608 still running (7195) Mar 18 09:06:59 managed-node03 dbus-daemon[622]: [system] Reloaded configuration Mar 18 09:06:59 managed-node03 dbus-daemon[622]: [system] Reloaded configuration Mar 18 09:06:59 managed-node03 dbus-daemon[622]: [system] Reloaded configuration Mar 18 09:06:59 managed-node03 systemd-udevd[529]: Network interface NamePolicy= disabled on kernel command line, ignoring. Mar 18 09:06:59 managed-node03 systemd-logind[620]: Watching system buttons on /dev/input/event1 (Sleep Button) Mar 18 09:07:00 managed-node03 systemd-logind[620]: Watching system buttons on /dev/input/event0 (Power Button) Mar 18 09:07:00 managed-node03 systemd-udevd[7729]: Using default interface naming scheme 'rhel-8.0'. Mar 18 09:07:00 managed-node03 systemd-logind[620]: Watching system buttons on /dev/input/event2 (AT Translated Set 2 keyboard) Mar 18 09:07:00 managed-node03 systemd[1]: Stopping GSSAPI Proxy Daemon... -- Subject: Unit gssproxy.service has begun shutting down -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit gssproxy.service has begun shutting down. Mar 18 09:07:00 managed-node03 systemd[1]: gssproxy.service: Succeeded. 
-- Subject: Unit succeeded -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- The unit gssproxy.service has successfully entered the 'dead' state. Mar 18 09:07:00 managed-node03 systemd[1]: Stopped GSSAPI Proxy Daemon. -- Subject: Unit gssproxy.service has finished shutting down -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit gssproxy.service has finished shutting down. Mar 18 09:07:00 managed-node03 systemd[1]: Starting GSSAPI Proxy Daemon... -- Subject: Unit gssproxy.service has begun start-up -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit gssproxy.service has begun starting up. Mar 18 09:07:00 managed-node03 systemd[1]: Started GSSAPI Proxy Daemon. -- Subject: Unit gssproxy.service has finished start-up -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit gssproxy.service has finished starting up. -- -- The start-up result is done. Mar 18 09:07:00 managed-node03 systemd[1]: Stopping GSSAPI Proxy Daemon... -- Subject: Unit gssproxy.service has begun shutting down -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit gssproxy.service has begun shutting down. Mar 18 09:07:00 managed-node03 systemd[1]: gssproxy.service: Succeeded. -- Subject: Unit succeeded -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- The unit gssproxy.service has successfully entered the 'dead' state. Mar 18 09:07:00 managed-node03 systemd[1]: Stopped GSSAPI Proxy Daemon. -- Subject: Unit gssproxy.service has finished shutting down -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit gssproxy.service has finished shutting down. Mar 18 09:07:00 managed-node03 systemd[1]: Starting GSSAPI Proxy Daemon... -- Subject: Unit gssproxy.service has begun start-up -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit gssproxy.service has begun starting up. 
Mar 18 09:07:00 managed-node03 systemd[1]: Started GSSAPI Proxy Daemon. -- Subject: Unit gssproxy.service has finished start-up -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit gssproxy.service has finished starting up. -- -- The start-up result is done. Mar 18 09:07:00 managed-node03 systemd[1]: Stopping GSSAPI Proxy Daemon... -- Subject: Unit gssproxy.service has begun shutting down -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit gssproxy.service has begun shutting down. Mar 18 09:07:00 managed-node03 systemd[1]: gssproxy.service: Succeeded. -- Subject: Unit succeeded -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- The unit gssproxy.service has successfully entered the 'dead' state. Mar 18 09:07:00 managed-node03 systemd[1]: Stopped GSSAPI Proxy Daemon. -- Subject: Unit gssproxy.service has finished shutting down -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit gssproxy.service has finished shutting down. Mar 18 09:07:00 managed-node03 systemd[1]: Starting GSSAPI Proxy Daemon... -- Subject: Unit gssproxy.service has begun start-up -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit gssproxy.service has begun starting up. Mar 18 09:07:00 managed-node03 systemd[1]: Started GSSAPI Proxy Daemon. -- Subject: Unit gssproxy.service has finished start-up -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit gssproxy.service has finished starting up. -- -- The start-up result is done. 
Mar 18 09:07:00 managed-node03 dbus-daemon[622]: [system] Reloaded configuration Mar 18 09:07:00 managed-node03 dbus-daemon[622]: [system] Reloaded configuration Mar 18 09:07:01 managed-node03 dbus-daemon[622]: [system] Reloaded configuration Mar 18 09:07:01 managed-node03 dbus-daemon[622]: [system] Reloaded configuration Mar 18 09:07:01 managed-node03 polkitd[949]: Reloading rules Mar 18 09:07:01 managed-node03 polkitd[949]: Collecting garbage unconditionally... Mar 18 09:07:01 managed-node03 polkitd[949]: Loading rules from directory /etc/polkit-1/rules.d Mar 18 09:07:01 managed-node03 polkitd[949]: Loading rules from directory /usr/share/polkit-1/rules.d Mar 18 09:07:01 managed-node03 polkitd[949]: Finished loading, compiling and executing 3 rules Mar 18 09:07:01 managed-node03 polkitd[949]: Reloading rules Mar 18 09:07:01 managed-node03 polkitd[949]: Collecting garbage unconditionally... Mar 18 09:07:01 managed-node03 polkitd[949]: Loading rules from directory /etc/polkit-1/rules.d Mar 18 09:07:01 managed-node03 polkitd[949]: Loading rules from directory /usr/share/polkit-1/rules.d Mar 18 09:07:01 managed-node03 polkitd[949]: Finished loading, compiling and executing 3 rules Mar 18 09:07:01 managed-node03 systemd[1]: Stopped target NFS client services. -- Subject: Unit nfs-client.target has finished shutting down -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit nfs-client.target has finished shutting down. Mar 18 09:07:01 managed-node03 systemd[1]: Stopping NFS client services. -- Subject: Unit nfs-client.target has begun shutting down -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit nfs-client.target has begun shutting down. Mar 18 09:07:01 managed-node03 systemd[1]: Reached target NFS client services. -- Subject: Unit nfs-client.target has finished start-up -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit nfs-client.target has finished starting up. 
-- -- The start-up result is done. Mar 18 09:07:01 managed-node03 systemd[1]: Reloading. Mar 18 09:07:01 managed-node03 systemd[1]: Reloading. Mar 18 09:07:02 managed-node03 systemd-udevd[529]: Network interface NamePolicy= disabled on kernel command line, ignoring. Mar 18 09:07:02 managed-node03 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. -- Subject: Unit run-r97d607d5314545a58023dc6066e0db30.service has finished start-up -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit run-r97d607d5314545a58023dc6066e0db30.service has finished starting up. -- -- The start-up result is done. Mar 18 09:07:02 managed-node03 systemd[1]: Starting man-db-cache-update.service... -- Subject: Unit man-db-cache-update.service has begun start-up -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit man-db-cache-update.service has begun starting up. Mar 18 09:07:02 managed-node03 systemd[1]: Reloading. Mar 18 09:07:02 managed-node03 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. -- Subject: Unit run-r22ad839daa7a4f2883590b6dfbe6a4c3.service has finished start-up -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit run-r22ad839daa7a4f2883590b6dfbe6a4c3.service has finished starting up. -- -- The start-up result is done. Mar 18 09:07:03 managed-node03 ansible-async_wrapper.py[7608]: Module complete (7608) Mar 18 09:07:04 managed-node03 ansible-async_wrapper.py[7607]: Done in kid B. Mar 18 09:07:04 managed-node03 systemd[1]: man-db-cache-update.service: Succeeded. -- Subject: Unit succeeded -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- The unit man-db-cache-update.service has successfully entered the 'dead' state. Mar 18 09:07:04 managed-node03 systemd[1]: Started man-db-cache-update.service. 
-- Subject: Unit man-db-cache-update.service has finished start-up -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit man-db-cache-update.service has finished starting up. -- -- The start-up result is done. Mar 18 09:07:04 managed-node03 systemd[1]: run-r97d607d5314545a58023dc6066e0db30.service: Succeeded. -- Subject: Unit succeeded -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- The unit run-r97d607d5314545a58023dc6066e0db30.service has successfully entered the 'dead' state. Mar 18 09:07:04 managed-node03 systemd[1]: run-r22ad839daa7a4f2883590b6dfbe6a4c3.service: Succeeded. -- Subject: Unit succeeded -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- The unit run-r22ad839daa7a4f2883590b6dfbe6a4c3.service has successfully entered the 'dead' state. Mar 18 09:07:48 managed-node03 sshd[4630]: Received disconnect from 10.31.40.234 port 41912:11: disconnected by user Mar 18 09:07:48 managed-node03 sshd[4630]: Disconnected from user root 10.31.40.234 port 41912 Mar 18 09:07:48 managed-node03 sshd[4627]: pam_unix(sshd:session): session closed for user root Mar 18 09:07:48 managed-node03 systemd[1]: session-8.scope: Succeeded. -- Subject: Unit succeeded -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- The unit session-8.scope has successfully entered the 'dead' state. Mar 18 09:07:48 managed-node03 systemd-logind[620]: Session 8 logged out. Waiting for processes to exit. Mar 18 09:07:48 managed-node03 systemd-logind[620]: Removed session 8. -- Subject: Session 8 has been terminated -- Defined-By: systemd -- Support: https://access.redhat.com/support -- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat -- -- A session with the ID 8 has been terminated. 
Mar 18 09:07:49 managed-node03 sshd[10088]: Accepted publickey for root from 10.31.40.234 port 51336 ssh2: ECDSA SHA256:Vge93M0aHCBgo1IUfcKx6Yq8LKsqsMC5D+QXx8ms+30
Mar 18 09:07:49 managed-node03 systemd[1]: Started Session 9 of user root.
-- Subject: Unit session-9.scope has finished start-up
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- Unit session-9.scope has finished starting up.
--
-- The start-up result is done.
Mar 18 09:07:49 managed-node03 systemd-logind[620]: New session 9 of user root.
-- Subject: A new session 9 has been created for user root
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
-- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A new session with the ID 9 has been created for the user root.
--
-- The leading process of the session is 10088.
Mar 18 09:07:49 managed-node03 sshd[10088]: pam_unix(sshd:session): session opened for user root by (uid=0)
Mar 18 09:07:49 managed-node03 platform-python[10194]: ansible-ansible.legacy.async_status Invoked with jid=j836737144146.7600 mode=status _async_dir=/root/.ansible_async
Mar 18 09:07:50 managed-node03 platform-python[10262]: ansible-ansible.legacy.async_status Invoked with jid=j836737144146.7600 mode=cleanup _async_dir=/root/.ansible_async
Mar 18 09:07:50 managed-node03 platform-python[10367]: ansible-ansible.builtin.stat Invoked with path=/var/log/leapp/leapp-report.txt follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Mar 18 09:07:51 managed-node03 platform-python[10472]: ansible-ansible.builtin.stat Invoked with path=/var/log/leapp/leapp-report.json follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Mar 18 09:07:51 managed-node03 platform-python[10577]: ansible-ansible.builtin.stat Invoked with path=/var/log/leapp/leapp-preupgrade.log follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Mar 18 09:07:52 managed-node03 platform-python[10682]: ansible-ansible.builtin.stat Invoked with path=/var/log/leapp/ansible_leapp_analysis.log follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Mar 18 09:07:52 managed-node03 platform-python[10789]: ansible-ansible.builtin.lineinfile Invoked with path=/var/log/leapp/ansible_leapp_analysis.log line=Job ended at 2026-03-18T13:07:52Z owner=root group=root mode=0644 state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None seuser=None serole=None selevel=None setype=None attributes=None
Mar 18 09:07:53 managed-node03 platform-python[10999]: ansible-ansible.legacy.stat Invoked with path=/var/log/leapp/ansible_leapp_analysis.log follow=True get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Mar 18 09:07:54 managed-node03 platform-python[11121]: ansible-ansible.legacy.copy Invoked with src=/var/log/leapp/ansible_leapp_analysis.log dest=/var/log/leapp/ansible_leapp_analysis_2026-03-18_13-06-35.log remote_src=True mode=preserve backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Mar 18 09:07:54 managed-node03 platform-python[11228]: ansible-ansible.builtin.file Invoked with path=/var/log/leapp/ansible_leapp_analysis.log state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Mar 18 09:07:55 managed-node03 platform-python[11333]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail rm -f /var/log/leapp/*.log rm -f /var/log/leapp/*.json rm -f /var/log/leapp/*.txt _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Mar 18 09:07:55 managed-node03 sshd[11354]: Accepted publickey for root from 10.31.40.234 port 39970 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Mar 18 09:07:55 managed-node03 systemd[1]: Started Session 10 of user root.
-- Subject: Unit session-10.scope has finished start-up
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- Unit session-10.scope has finished starting up.
--
-- The start-up result is done.
Mar 18 09:07:55 managed-node03 systemd-logind[620]: New session 10 of user root.
-- Subject: A new session 10 has been created for user root
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
-- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A new session with the ID 10 has been created for the user root.
--
-- The leading process of the session is 11354.
Mar 18 09:07:55 managed-node03 sshd[11354]: pam_unix(sshd:session): session opened for user root by (uid=0)
Mar 18 09:07:55 managed-node03 sshd[11357]: Received disconnect from 10.31.40.234 port 39970:11: disconnected by user
Mar 18 09:07:55 managed-node03 sshd[11357]: Disconnected from user root 10.31.40.234 port 39970
Mar 18 09:07:55 managed-node03 sshd[11354]: pam_unix(sshd:session): session closed for user root
Mar 18 09:07:55 managed-node03 systemd[1]: session-10.scope: Succeeded.
-- Subject: Unit succeeded
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- The unit session-10.scope has successfully entered the 'dead' state.
Mar 18 09:07:55 managed-node03 systemd-logind[620]: Session 10 logged out. Waiting for processes to exit.
Mar 18 09:07:55 managed-node03 systemd-logind[620]: Removed session 10.
-- Subject: Session 10 has been terminated
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
-- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A session with the ID 10 has been terminated.
Mar 18 09:07:55 managed-node03 sshd[11375]: Accepted publickey for root from 10.31.40.234 port 39974 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Mar 18 09:07:55 managed-node03 systemd[1]: Started Session 11 of user root.
-- Subject: Unit session-11.scope has finished start-up
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- Unit session-11.scope has finished starting up.
--
-- The start-up result is done.
Mar 18 09:07:55 managed-node03 systemd-logind[620]: New session 11 of user root.
-- Subject: A new session 11 has been created for user root
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
-- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A new session with the ID 11 has been created for the user root.
--
-- The leading process of the session is 11375.
Mar 18 09:07:55 managed-node03 sshd[11375]: pam_unix(sshd:session): session opened for user root by (uid=0)