[WARNING]: Collection infra.leapp does not support Ansible version 2.14.18
[WARNING]: running playbook inside collection infra.leapp
ansible-playbook [core 2.14.18]
  config file = /etc/ansible/ansible.cfg
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python3.9/site-packages/ansible
  ansible collection location = /root/.ansible/collections:/usr/share/ansible/collections
  executable location = /usr/bin/ansible-playbook
  python version = 3.9.25 (main, Mar 9 2026, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-14)] (/usr/bin/python3)
  jinja version = 3.1.2
  libyaml = True
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_default.yml ****************************************************
1 plays in /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tests/tests_default.yml

PLAY [Test] ********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tests/tests_default.yml:2
ok: [managed-node01]

TASK [Test | Run role analysis] ************************************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tests/tests_default.yml:9

TASK [infra.leapp.analysis : Lock timestamped variables] ***********************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/main.yml:5
ok: [managed-node01] => {"ansible_facts": {"__leapp_timestamp": "2026-03-19_14-34-08"}, "changed": false}

TASK [Initialize lock, logging, and common vars] *******************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/main.yml:9

TASK [infra.leapp.common : init_leapp_log | Ensure that log directory exists] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/init_leapp_log.yml:9
changed: [managed-node01] => {"changed": true, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/var/log/leapp", "secontext": "unconfined_u:object_r:var_log_t:s0", "size": 6, "state": "directory", "uid": 0}

TASK [infra.leapp.common : init_leapp_log | Check for existing log file] *******
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/init_leapp_log.yml:17
ok: [managed-node01] => {"changed": false, "stat": {"exists": false}}

TASK [infra.leapp.common : init_leapp_log | Fail if log file already exists] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/init_leapp_log.yml:22
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : init_leapp_log | Create new log file] ***************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/init_leapp_log.yml:31
changed: [managed-node01] => {"changed": true, "checksum": "8004fee34f100867a341c60fcc78b4a5e2844e42", "dest": "/var/log/leapp/ansible_leapp_analysis.log", "gid": 0, "group": "root", "md5sum": "a2523d38f99e2ab228315752c300bfa6", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:var_log_t:s0", "size": 70, "src": "/root/.ansible/tmp/ansible-tmp-1773930849.93014-5729-30970600728414/source", "state": "file", "uid": 0}

TASK [infra.leapp.common : init_leapp_log | /etc/ansible/facts.d directory exists] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/init_leapp_log.yml:41
changed: [managed-node01] => {"changed": true, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/etc/ansible/facts.d", "secontext": "unconfined_u:object_r:etc_t:s0", "size": 6, "state": "directory", "uid": 0}

TASK [infra.leapp.common : init_leapp_log | Capture current ansible_facts for validation after upgrade] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/init_leapp_log.yml:49
changed: [managed-node01] => (item=/etc/ansible/facts.d/pre_ipu.fact) => {"ansible_loop_var": "item", "changed": true, "checksum": "16effcf253d544decdb944b12b4df39ac8a326a8", "dest": "/etc/ansible/facts.d/pre_ipu.fact", "gid": 0, "group": "root", "item": "/etc/ansible/facts.d/pre_ipu.fact", "md5sum": "825cb8e71c2ca3665bca645937a221b3", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:etc_t:s0", "size": 14665, "src": "/root/.ansible/tmp/ansible-tmp-1773930851.1490128-5760-19110018209395/source", "state": "file", "uid": 0}
changed: [managed-node01] => (item=/var/log/leapp/ansible_leapp_analysis.log) => {"ansible_loop_var": "item", "changed": true, "checksum": "16effcf253d544decdb944b12b4df39ac8a326a8", "dest": "/var/log/leapp/ansible_leapp_analysis.log", "gid": 0, "group": "root", "item": "/var/log/leapp/ansible_leapp_analysis.log", "md5sum": "825cb8e71c2ca3665bca645937a221b3", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:var_log_t:s0", "size": 14665, "src": "/root/.ansible/tmp/ansible-tmp-1773930851.8219929-5760-250312251780085/source", "state": "file", "uid": 0}

TASK [infra.leapp.common : init_leapp_log | Capture a list of non-rhel versioned packages] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/init_leapp_log.yml:60
ok: [managed-node01] => {"changed": false, "cmd": "set -o pipefail; export PATH=$PATH; rpm -qa | grep -ve '[\\.|+]el9' | grep -vE '^(gpg-pubkey|libmodulemd|katello-ca-consumer)' | sort", "delta": "0:00:00.176646", "end": "2026-03-19 10:34:13.138590", "failed_when_result": false, "msg": "non-zero return code", "rc": 1, "start": "2026-03-19 10:34:12.961944", "stderr": "", "stderr_lines": [], "stdout": "", "stdout_lines": []}

TASK [infra.leapp.common : init_leapp_log | Create fact with the non-rhel versioned packages list] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/init_leapp_log.yml:74
ok: [managed-node01] => {"ansible_facts": {"non_rhel_packages": []}, "changed": false}

TASK [infra.leapp.common : init_leapp_log | Capture the list of non-rhel versioned packages in a separate fact file] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/init_leapp_log.yml:78
changed: [managed-node01] => (item=/etc/ansible/facts.d/non_rhel_packages.fact) => {"ansible_loop_var": "item", "changed": true, "checksum": "97d170e1550eee4afc0af065b78cda302a97674c", "dest": "/etc/ansible/facts.d/non_rhel_packages.fact", "gid": 0, "group": "root", "item": "/etc/ansible/facts.d/non_rhel_packages.fact", "md5sum": "d751713988987e9331980363e24189ce", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:etc_t:s0", "size": 2, "src": "/root/.ansible/tmp/ansible-tmp-1773930853.267429-5807-11508480713562/source", "state": "file", "uid": 0}
changed: [managed-node01] => (item=/var/log/leapp/ansible_leapp_analysis.log) => {"ansible_loop_var": "item", "changed": true, "checksum": "97d170e1550eee4afc0af065b78cda302a97674c", "dest": "/var/log/leapp/ansible_leapp_analysis.log", "gid": 0, "group": "root", "item": "/var/log/leapp/ansible_leapp_analysis.log", "md5sum": "d751713988987e9331980363e24189ce", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:var_log_t:s0", "size": 2, "src": "/root/.ansible/tmp/ansible-tmp-1773930853.9279163-5807-251082185832833/source", "state": "file", "uid": 0}

TASK [infra.leapp.analysis : Include tasks for preupg assistant analysis] ******
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/main.yml:19
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.analysis : Include tasks for leapp preupgrade analysis] ******
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/main.yml:23
included: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml for managed-node01

TASK [analysis-leapp | Include pre_upgrade_update.yml] *************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml:4

TASK [infra.leapp.common : pre_upgrade_update | Register with Satellite activation key] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/pre_upgrade_update.yml:3
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [pre_upgrade_update | Include custom_local_repos for local_repos_pre_leapp] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/pre_upgrade_update.yml:10
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : pre_upgrade_update | Get package version lock entries] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/pre_upgrade_update.yml:21
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : pre_upgrade_update | Remove all package version locks] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/pre_upgrade_update.yml:28
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : pre_upgrade_update | Install packages for upgrade from RHEL 7] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/pre_upgrade_update.yml:36
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : pre_upgrade_update | Include update-and-reboot.yml] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/pre_upgrade_update.yml:45
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.analysis : analysis-leapp | Ensure leapp log directory exists] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml:11
changed: [managed-node01] => {"changed": true, "gid": 0, "group": "root", "mode": "0700", "owner": "root", "path": "/var/log/leapp", "secontext": "unconfined_u:object_r:var_log_t:s0", "size": 40, "state": "directory", "uid": 0}

TASK [infra.leapp.analysis : analysis-leapp | Populate leapp_answers file] *****
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml:19
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [analysis-leapp | Create /etc/leapp/files/leapp_upgrade_repositories.repo] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml:28
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.analysis : analysis-leapp | Leapp preupgrade report] *********
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml:39
ASYNC FAILED on managed-node01: jid=j750420252764.7378
fatal: [managed-node01]: FAILED!
=> {"ansible_job_id": "j750420252764.7378", "changed": true, "cmd": "set -o pipefail; export PATH=$PATH; ulimit -n 16384; leapp preupgrade --report-schema=1.2.0 2>&1 | tee -a /var/log/leapp/ansible_leapp_analysis.log\n", "delta": "0:00:00.004178", "end": "2026-03-19 10:34:15.879547", "failed_when_result": true, "finished": 1, "msg": "non-zero return code", "rc": 127, "results_file": "/root/.ansible_async/j750420252764.7378", "start": "2026-03-19 10:34:15.875369", "started": 1, "stderr": "", "stderr_lines": [], "stdout": "/bin/bash: line 1: leapp: command not found", "stdout_lines": ["/bin/bash: line 1: leapp: command not found"]}

TASK [analysis-leapp | Include custom_local_repos for local_repos_post_analysis] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml:70
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [analysis-leapp | Restore original Satellite activation key] **************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml:80
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [analysis-leapp | Copy reports to the controller] *************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml:91

TASK [infra.leapp.common : copy_reports_to_controller | Ensure reports directory on controller] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_reports_to_controller.yml:20
changed: [managed-node01 -> localhost] => {"changed": true, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tests/ansible_leapp_analysis_logs_2026-03-19_14-34-08", "secontext": "unconfined_u:object_r:admin_home_t:s0", "size": 6, "state": "directory", "uid": 0}

TASK [infra.leapp.common : copy_reports_to_controller | Fetch report files if they exist] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_reports_to_controller.yml:30
included: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/fetch_file_if_exists.yml for managed-node01 => (item=/var/log/leapp/leapp-report.txt)
included: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/fetch_file_if_exists.yml for managed-node01 => (item=/var/log/leapp/leapp-report.json)
included: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/fetch_file_if_exists.yml for managed-node01 => (item=/var/log/leapp/leapp-preupgrade.log)

TASK [infra.leapp.common : fetch_file_if_exists | Check if file exists] ********
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/fetch_file_if_exists.yml:7
ok: [managed-node01] => {"changed": false, "stat": {"exists": false}}

TASK [infra.leapp.common : fetch_file_if_exists | Copy report file to the controller] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/fetch_file_if_exists.yml:12
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : fetch_file_if_exists | Check if file exists] ********
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/fetch_file_if_exists.yml:7
ok: [managed-node01] => {"changed": false, "stat": {"exists": false}}

TASK [infra.leapp.common : fetch_file_if_exists | Copy report file to the controller] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/fetch_file_if_exists.yml:12
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : fetch_file_if_exists | Check if file exists] ********
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/fetch_file_if_exists.yml:7
ok: [managed-node01] => {"changed": false, "stat": {"exists": false}}

TASK [infra.leapp.common : fetch_file_if_exists | Copy report file to the controller] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/fetch_file_if_exists.yml:12
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [copy_reports_to_controller | Copy log file to the controller] ************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_reports_to_controller.yml:39

TASK [infra.leapp.common : copy_archive_leapp_log | Check for log file] ********
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_archive_leapp_log.yml:11
ok: [managed-node01] => {"changed": false, "stat": {"atime": 1773930854.501163, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "6e290f8d95b1dcae170f6df094742e47fc4a2ec1", "ctime": 1773930855.8781629, "dev": 51716, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 654311558, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "application/json", "mode": "0644", "mtime": 1773930855.8781629, "nlink": 1, "path": "/var/log/leapp/ansible_leapp_analysis.log", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 46, "uid": 0, "version": "2087409440", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}}

TASK [infra.leapp.common : copy_archive_leapp_log | Add end time to log file] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_archive_leapp_log.yml:19
changed: [managed-node01] => {"backup": "", "changed": true, "msg": "line added"}

TASK [infra.leapp.common : copy_archive_leapp_log | Slurp file /var/log/leapp/ansible_leapp_analysis.log] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_archive_leapp_log.yml:27
ok: [managed-node01] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}

TASK [infra.leapp.common : copy_archive_leapp_log | Decode file /var/log/leapp/ansible_leapp_analysis.log] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_archive_leapp_log.yml:33
ok: [managed-node01] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}

TASK [infra.leapp.common : copy_archive_leapp_log | Ensure reports directory on controller] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_archive_leapp_log.yml:38
ok: [managed-node01 -> localhost] => {"changed": false, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tests/ansible_leapp_analysis_logs_2026-03-19_14-34-08", "secontext": "unconfined_u:object_r:admin_home_t:s0", "size": 6, "state": "directory", "uid": 0}

TASK [infra.leapp.common : copy_archive_leapp_log | Copy ansible leapp log to the controller] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_archive_leapp_log.yml:48
changed: [managed-node01] => {"changed": true, "checksum": "097fe63b1a9030e8af842cfd097def05885b2aa7", "dest": "/root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tests/ansible_leapp_analysis_logs_2026-03-19_14-34-08/managed-node01/ansible_leapp_analysis.log", "md5sum": "b320dd4f8c258d697b64b1b42f9dde67", "remote_checksum": "097fe63b1a9030e8af842cfd097def05885b2aa7", "remote_md5sum": null}

TASK [infra.leapp.common : copy_archive_leapp_log | Copy log file to timestamped location] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_archive_leapp_log.yml:54
changed: [managed-node01] => {"changed": true, "checksum": "097fe63b1a9030e8af842cfd097def05885b2aa7", "dest": "/var/log/leapp/ansible_leapp_analysis_2026-03-19_14-34-08.log", "gid": 0, "group": "root", "md5sum": "b320dd4f8c258d697b64b1b42f9dde67", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:var_log_t:s0", "size": 80, "src": "/var/log/leapp/ansible_leapp_analysis.log", "state": "file", "uid": 0}

TASK [infra.leapp.common : copy_archive_leapp_log | Remove original log file] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_archive_leapp_log.yml:66
changed: [managed-node01] => {"changed": true, "path": "/var/log/leapp/ansible_leapp_analysis.log", "state": "absent"}

TASK [Test | Include cleanup logs] *********************************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tests/tests_default.yml:13

TASK [infra.leapp.common : cleanup_logs | Cleanup | Remove log files] **********
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/cleanup_logs.yml:2
changed: [managed-node01] => {"changed": true, "cmd": "set -euxo pipefail\nrm -f /var/log/leapp/*.log\nrm -f /var/log/leapp/*.json\nrm -f /var/log/leapp/*.txt\n", "delta": "0:00:00.005233", "end": "2026-03-19 10:35:21.261675", "msg": "", "rc": 0, "start": "2026-03-19 10:35:21.256442", "stderr": "+ rm -f /var/log/leapp/ansible_leapp_analysis_2026-03-19_14-34-08.log\n+ rm -f '/var/log/leapp/*.json'\n+ rm -f '/var/log/leapp/*.txt'", "stderr_lines": ["+ rm -f /var/log/leapp/ansible_leapp_analysis_2026-03-19_14-34-08.log", "+ rm -f '/var/log/leapp/*.json'", "+ rm -f '/var/log/leapp/*.txt'"],
"stdout": "", "stdout_lines": []}

PLAY RECAP *********************************************************************
managed-node01             : ok=28   changed=12   unreachable=0    failed=1    skipped=15   rescued=0    ignored=0

Mar 19 10:34:06 managed-node01 sshd[4873]: Accepted publickey for root from 10.31.15.84 port 35440 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Mar 19 10:34:06 managed-node01 systemd-logind[608]: New session 6 of user root.
░░ Subject: A new session 6 has been created for user root
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A new session with the ID 6 has been created for the user root.
░░
░░ The leading process of the session is 4873.
Mar 19 10:34:06 managed-node01 systemd[1]: Started Session 6 of User root.
░░ Subject: A start job for unit session-6.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit session-6.scope has finished successfully.
░░
░░ The job identifier is 1154.
Mar 19 10:34:06 managed-node01 sshd[4873]: pam_unix(sshd:session): session opened for user root(uid=0) by root(uid=0)
Mar 19 10:34:06 managed-node01 sshd[4876]: Received disconnect from 10.31.15.84 port 35440:11: disconnected by user
Mar 19 10:34:06 managed-node01 sshd[4876]: Disconnected from user root 10.31.15.84 port 35440
Mar 19 10:34:06 managed-node01 sshd[4873]: pam_unix(sshd:session): session closed for user root
Mar 19 10:34:06 managed-node01 systemd[1]: session-6.scope: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit session-6.scope has successfully entered the 'dead' state.
Mar 19 10:34:06 managed-node01 systemd-logind[608]: Session 6 logged out. Waiting for processes to exit.
Mar 19 10:34:06 managed-node01 systemd-logind[608]: Removed session 6.
░░ Subject: Session 6 has been terminated
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A session with the ID 6 has been terminated.
Mar 19 10:34:07 managed-node01 sshd[4902]: Accepted publickey for root from 10.31.15.84 port 35452 ssh2: ECDSA SHA256:5dKg62FZTxyDk+oDA3dCp86Ela2X33u4kD8Rv9RzRYE
Mar 19 10:34:07 managed-node01 systemd-logind[608]: New session 7 of user root.
░░ Subject: A new session 7 has been created for user root
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A new session with the ID 7 has been created for the user root.
░░
░░ The leading process of the session is 4902.
Mar 19 10:34:07 managed-node01 systemd[1]: Started Session 7 of User root.
░░ Subject: A start job for unit session-7.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit session-7.scope has finished successfully.
░░
░░ The job identifier is 1237.
Mar 19 10:34:07 managed-node01 sshd[4902]: pam_unix(sshd:session): session opened for user root(uid=0) by root(uid=0)
Mar 19 10:34:08 managed-node01 python3[5079]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Mar 19 10:34:09 managed-node01 python3[5256]: ansible-ansible.builtin.file Invoked with path=/var/log/leapp state=directory owner=root group=root mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Mar 19 10:34:09 managed-node01 python3[5405]: ansible-ansible.builtin.stat Invoked with path=/var/log/leapp/ansible_leapp_analysis.log follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Mar 19 10:34:10 managed-node01 python3[5554]: ansible-ansible.legacy.stat Invoked with path=/var/log/leapp/ansible_leapp_analysis.log follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Mar 19 10:34:10 managed-node01 python3[5674]: ansible-ansible.legacy.copy Invoked with dest=/var/log/leapp/ansible_leapp_analysis.log owner=root group=root mode=0644 src=/root/.ansible/tmp/ansible-tmp-1773930849.93014-5729-30970600728414/source _original_basename=tmp8wgyq01q follow=False checksum=8004fee34f100867a341c60fcc78b4a5e2844e42 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Mar 19 10:34:11 managed-node01 python3[5823]: ansible-ansible.builtin.file Invoked with path=/etc/ansible/facts.d state=directory mode=0755 owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Mar 19 10:34:11 managed-node01 python3[5972]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/pre_ipu.fact follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Mar 19 10:34:11 managed-node01 python3[6092]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/pre_ipu.fact mode=0644 owner=root group=root src=/root/.ansible/tmp/ansible-tmp-1773930851.1490128-5760-19110018209395/source _original_basename=tmp7s4tjb7t follow=False checksum=16effcf253d544decdb944b12b4df39ac8a326a8 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Mar 19 10:34:12 managed-node01 python3[6241]: ansible-ansible.legacy.stat Invoked with path=/var/log/leapp/ansible_leapp_analysis.log follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Mar 19 10:34:12 managed-node01 python3[6363]: ansible-ansible.legacy.copy Invoked with dest=/var/log/leapp/ansible_leapp_analysis.log mode=0644 owner=root group=root src=/root/.ansible/tmp/ansible-tmp-1773930851.8219929-5760-250312251780085/source _original_basename=tmp2370e2r8 follow=False checksum=16effcf253d544decdb944b12b4df39ac8a326a8 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Mar 19 10:34:12 managed-node01 python3[6512]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; export PATH=$PATH; rpm -qa | grep -ve '[\.|+]el9' | grep -vE '^(gpg-pubkey|libmodulemd|katello-ca-consumer)' | sort _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Mar 19 10:34:13 managed-node01 python3[6666]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/non_rhel_packages.fact follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Mar 19 10:34:13 managed-node01 python3[6786]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/non_rhel_packages.fact mode=0644 owner=root group=root src=/root/.ansible/tmp/ansible-tmp-1773930853.267429-5807-11508480713562/source _original_basename=tmpleisxp_e follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Mar 19 10:34:14 managed-node01 python3[6935]: ansible-ansible.legacy.stat Invoked with path=/var/log/leapp/ansible_leapp_analysis.log follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Mar 19 10:34:14 managed-node01 python3[7057]: ansible-ansible.legacy.copy Invoked with dest=/var/log/leapp/ansible_leapp_analysis.log mode=0644 owner=root group=root src=/root/.ansible/tmp/ansible-tmp-1773930853.9279163-5807-251082185832833/source _original_basename=tmpshu6qh8o follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Mar 19 10:34:15 managed-node01 python3[7206]: ansible-ansible.builtin.file Invoked with path=/var/log/leapp state=directory owner=root group=root mode=0700 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Mar 19 10:34:15 managed-node01 ansible-async_wrapper.py[7378]: Invoked with j750420252764 7200 /root/.ansible/tmp/ansible-tmp-1773930855.294402-5874-13138269313842/AnsiballZ_command.py _
Mar 19 10:34:15 managed-node01 ansible-async_wrapper.py[7381]: Starting module and watcher
Mar 19 10:34:15 managed-node01 ansible-async_wrapper.py[7381]: Start watching 7382 (7200)
Mar 19 10:34:15 managed-node01 ansible-async_wrapper.py[7382]: Start module (7382)
Mar 19 10:34:15 managed-node01 ansible-async_wrapper.py[7378]: Return async_wrapper task started.
Mar 19 10:34:15 managed-node01 python3[7383]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=set -o pipefail; export PATH=$PATH; ulimit -n 16384; leapp preupgrade --report-schema=1.2.0 2>&1 | tee -a /var/log/leapp/ansible_leapp_analysis.log _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Mar 19 10:34:15 managed-node01 ansible-async_wrapper.py[7382]: Module complete (7382)
Mar 19 10:34:20 managed-node01 ansible-async_wrapper.py[7381]: Done in kid B.
Mar 19 10:35:15 managed-node01 sshd[4905]: Received disconnect from 10.31.15.84 port 35452:11: disconnected by user
Mar 19 10:35:15 managed-node01 sshd[4905]: Disconnected from user root 10.31.15.84 port 35452
Mar 19 10:35:15 managed-node01 sshd[4902]: pam_unix(sshd:session): session closed for user root
Mar 19 10:35:15 managed-node01 systemd-logind[608]: Session 7 logged out. Waiting for processes to exit.
Mar 19 10:35:15 managed-node01 systemd[1]: session-7.scope: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit session-7.scope has successfully entered the 'dead' state.
Mar 19 10:35:15 managed-node01 systemd[1]: session-7.scope: Consumed 5.459s CPU time.
░░ Subject: Resources consumed by unit runtime
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit session-7.scope completed and consumed the indicated resources.
Mar 19 10:35:15 managed-node01 systemd-logind[608]: Removed session 7.
░░ Subject: Session 7 has been terminated
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A session with the ID 7 has been terminated.
Mar 19 10:35:15 managed-node01 sshd[7389]: Accepted publickey for root from 10.31.15.84 port 47080 ssh2: ECDSA SHA256:5dKg62FZTxyDk+oDA3dCp86Ela2X33u4kD8Rv9RzRYE
Mar 19 10:35:15 managed-node01 systemd-logind[608]: New session 8 of user root.
░░ Subject: A new session 8 has been created for user root
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A new session with the ID 8 has been created for the user root.
░░
░░ The leading process of the session is 7389.
Mar 19 10:35:15 managed-node01 systemd[1]: Started Session 8 of User root.
░░ Subject: A start job for unit session-8.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit session-8.scope has finished successfully.
░░
░░ The job identifier is 1320.
Mar 19 10:35:15 managed-node01 sshd[7389]: pam_unix(sshd:session): session opened for user root(uid=0) by root(uid=0)
Mar 19 10:35:16 managed-node01 python3[7540]: ansible-ansible.legacy.async_status Invoked with jid=j750420252764.7378 mode=status _async_dir=/root/.ansible_async
Mar 19 10:35:16 managed-node01 python3[7636]: ansible-ansible.legacy.async_status Invoked with jid=j750420252764.7378 mode=cleanup _async_dir=/root/.ansible_async
Mar 19 10:35:17 managed-node01 python3[7785]: ansible-ansible.builtin.stat Invoked with path=/var/log/leapp/leapp-report.txt follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Mar 19 10:35:17 managed-node01 python3[7934]: ansible-ansible.builtin.stat Invoked with path=/var/log/leapp/leapp-report.json follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Mar 19 10:35:18 managed-node01 python3[8083]: ansible-ansible.builtin.stat Invoked with path=/var/log/leapp/leapp-preupgrade.log follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Mar 19 10:35:18 managed-node01 python3[8232]: ansible-ansible.builtin.stat Invoked with path=/var/log/leapp/ansible_leapp_analysis.log follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Mar 19 10:35:18 managed-node01 python3[8383]: ansible-ansible.builtin.lineinfile Invoked with path=/var/log/leapp/ansible_leapp_analysis.log line=Job ended at 2026-03-19T14:35:18Z owner=root group=root mode=0644 state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None seuser=None serole=None selevel=None setype=None attributes=None
Mar 19 10:35:20 managed-node01 python3[8681]: ansible-ansible.legacy.stat Invoked with path=/var/log/leapp/ansible_leapp_analysis.log follow=True get_checksum=True
checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Mar 19 10:35:20 managed-node01 python3[8855]: ansible-ansible.legacy.copy Invoked with src=/var/log/leapp/ansible_leapp_analysis.log dest=/var/log/leapp/ansible_leapp_analysis_2026-03-19_14-34-08.log remote_src=True mode=preserve backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Mar 19 10:35:20 managed-node01 python3[9004]: ansible-ansible.builtin.file Invoked with path=/var/log/leapp/ansible_leapp_analysis.log state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Mar 19 10:35:21 managed-node01 python3[9153]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail rm -f /var/log/leapp/*.log rm -f /var/log/leapp/*.json rm -f /var/log/leapp/*.txt _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Mar 19 10:35:21 managed-node01 sshd[9181]: Accepted publickey for root from 10.31.15.84 port 44572 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE Mar 19 10:35:21 managed-node01 systemd-logind[608]: New session 9 of user root. ░░ Subject: A new session 9 has been created for user root ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ Documentation: sd-login(3) ░░ ░░ A new session with the ID 9 has been created for the user root. ░░ ░░ The leading process of the session is 9181. Mar 19 10:35:21 managed-node01 systemd[1]: Started Session 9 of User root. 
░░ Subject: A start job for unit session-9.scope has finished successfully ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit session-9.scope has finished successfully. ░░ ░░ The job identifier is 1403. Mar 19 10:35:21 managed-node01 sshd[9181]: pam_unix(sshd:session): session opened for user root(uid=0) by root(uid=0) Mar 19 10:35:21 managed-node01 sshd[9184]: Received disconnect from 10.31.15.84 port 44572:11: disconnected by user Mar 19 10:35:21 managed-node01 sshd[9184]: Disconnected from user root 10.31.15.84 port 44572 Mar 19 10:35:21 managed-node01 sshd[9181]: pam_unix(sshd:session): session closed for user root Mar 19 10:35:21 managed-node01 systemd[1]: session-9.scope: Deactivated successfully. ░░ Subject: Unit succeeded ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ The unit session-9.scope has successfully entered the 'dead' state. Mar 19 10:35:21 managed-node01 systemd-logind[608]: Session 9 logged out. Waiting for processes to exit. Mar 19 10:35:21 managed-node01 systemd-logind[608]: Removed session 9. ░░ Subject: Session 9 has been terminated ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ Documentation: sd-login(3) ░░ ░░ A session with the ID 9 has been terminated. Mar 19 10:35:21 managed-node01 sshd[9209]: Accepted publickey for root from 10.31.15.84 port 44578 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE Mar 19 10:35:21 managed-node01 systemd-logind[608]: New session 10 of user root. ░░ Subject: A new session 10 has been created for user root ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ Documentation: sd-login(3) ░░ ░░ A new session with the ID 10 has been created for the user root. ░░ ░░ The leading process of the session is 9209. Mar 19 10:35:21 managed-node01 systemd[1]: Started Session 10 of User root. 
░░ Subject: A start job for unit session-10.scope has finished successfully ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ ░░ A start job for unit session-10.scope has finished successfully. ░░ ░░ The job identifier is 1486. Mar 19 10:35:21 managed-node01 sshd[9209]: pam_unix(sshd:session): session opened for user root(uid=0) by root(uid=0)