[WARNING]: Collection infra.leapp does not support Ansible version 2.14.18
[WARNING]: running playbook inside collection infra.leapp
ansible-playbook [core 2.14.18]
  config file = /etc/ansible/ansible.cfg
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python3.9/site-packages/ansible
  ansible collection location = /root/.ansible/collections:/usr/share/ansible/collections
  executable location = /usr/bin/ansible-playbook
  python version = 3.9.25 (main, Mar 9 2026, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-14)] (/usr/bin/python3)
  jinja version = 3.1.2
  libyaml = True
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_default.yml ****************************************************
1 plays in /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tests/tests_default.yml

PLAY [Test] ********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tests/tests_default.yml:2
ok: [managed-node01]

TASK [Test | Run role analysis] ************************************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tests/tests_default.yml:9

TASK [infra.leapp.analysis : Lock timestamped variables] ***********************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/main.yml:5
ok: [managed-node01] => {"ansible_facts": {"__leapp_timestamp": "2026-03-18_13-01-55"}, "changed": false}

TASK [Initialize lock, logging, and common vars] *******************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/main.yml:9

TASK [infra.leapp.common : init_leapp_log | Ensure that log directory exists] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/init_leapp_log.yml:9
changed: [managed-node01] => {"changed": true, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/var/log/leapp", "secontext": "unconfined_u:object_r:var_log_t:s0", "size": 6, "state": "directory", "uid": 0}

TASK [infra.leapp.common : init_leapp_log | Check for existing log file] *******
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/init_leapp_log.yml:17
ok: [managed-node01] => {"changed": false, "stat": {"exists": false}}

TASK [infra.leapp.common : init_leapp_log | Fail if log file already exists] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/init_leapp_log.yml:22
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : init_leapp_log | Create new log file] ***************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/init_leapp_log.yml:31
changed: [managed-node01] => {"changed": true, "checksum": "9b939dd04ee0ee57ece2b51ab712679728192e47", "dest": "/var/log/leapp/ansible_leapp_analysis.log", "gid": 0, "group": "root", "md5sum": "393c8bb7847e391634d77c26d0e71bfd", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:var_log_t:s0", "size": 70, "src": "/root/.ansible/tmp/ansible-tmp-1773838916.8280566-5744-269004054828565/source", "state": "file", "uid": 0}

TASK [infra.leapp.common : init_leapp_log | /etc/ansible/facts.d directory exists] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/init_leapp_log.yml:41
changed: [managed-node01] => {"changed": true, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/etc/ansible/facts.d", "secontext": "unconfined_u:object_r:etc_t:s0", "size": 6, "state": "directory", "uid": 0}

TASK [infra.leapp.common : init_leapp_log | Capture current ansible_facts for validation after upgrade] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/init_leapp_log.yml:49
changed: [managed-node01] => (item=/etc/ansible/facts.d/pre_ipu.fact) => {"ansible_loop_var": "item", "changed": true, "checksum": "fb7dd957f444ef9a7db8fd28f040bb66ad7fe777", "dest": "/etc/ansible/facts.d/pre_ipu.fact", "gid": 0, "group": "root", "item": "/etc/ansible/facts.d/pre_ipu.fact", "md5sum": "3d1fd6a0a834637d2a547711683ce8e8", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:etc_t:s0", "size": 13842, "src": "/root/.ansible/tmp/ansible-tmp-1773838917.9049356-5772-236862838640271/source", "state": "file", "uid": 0}
changed: [managed-node01] => (item=/var/log/leapp/ansible_leapp_analysis.log) => {"ansible_loop_var": "item", "changed": true, "checksum": "fb7dd957f444ef9a7db8fd28f040bb66ad7fe777", "dest": "/var/log/leapp/ansible_leapp_analysis.log", "gid": 0, "group": "root", "item": "/var/log/leapp/ansible_leapp_analysis.log", "md5sum": "3d1fd6a0a834637d2a547711683ce8e8", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:var_log_t:s0", "size": 13842, "src": "/root/.ansible/tmp/ansible-tmp-1773838918.5316932-5772-181941647262429/source", "state": "file", "uid": 0}

TASK [infra.leapp.common : init_leapp_log | Capture a list of non-rhel versioned packages] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/init_leapp_log.yml:60
ok: [managed-node01] => {"changed": false, "cmd": "set -o pipefail; export PATH=$PATH; rpm -qa | grep -ve '[\\.|+]el8' | grep -vE '^(gpg-pubkey|libmodulemd|katello-ca-consumer)' | sort", "delta": "0:00:00.853969", "end": "2026-03-18 09:02:00.399700", "failed_when_result": false, "msg": "non-zero return code", "rc": 1, "start": "2026-03-18 09:01:59.545731", "stderr": "", "stderr_lines": [], "stdout": "", "stdout_lines": []}

TASK [infra.leapp.common : init_leapp_log | Create fact with the non-rhel versioned packages list] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/init_leapp_log.yml:74
ok: [managed-node01] => {"ansible_facts": {"non_rhel_packages": []}, "changed": false}

TASK [infra.leapp.common : init_leapp_log | Capture the list of non-rhel versioned packages in a separate fact file] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/init_leapp_log.yml:78
changed: [managed-node01] => (item=/etc/ansible/facts.d/non_rhel_packages.fact) => {"ansible_loop_var": "item", "changed": true, "checksum": "97d170e1550eee4afc0af065b78cda302a97674c", "dest": "/etc/ansible/facts.d/non_rhel_packages.fact", "gid": 0, "group": "root", "item": "/etc/ansible/facts.d/non_rhel_packages.fact", "md5sum": "d751713988987e9331980363e24189ce", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:etc_t:s0", "size": 2, "src": "/root/.ansible/tmp/ansible-tmp-1773838920.5072963-5822-178778724132474/source", "state": "file", "uid": 0}
changed: [managed-node01] => (item=/var/log/leapp/ansible_leapp_analysis.log) => {"ansible_loop_var": "item", "changed": true, "checksum": "97d170e1550eee4afc0af065b78cda302a97674c", "dest": "/var/log/leapp/ansible_leapp_analysis.log", "gid": 0, "group": "root", "item": "/var/log/leapp/ansible_leapp_analysis.log", "md5sum": "d751713988987e9331980363e24189ce", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:var_log_t:s0", "size": 2, "src": "/root/.ansible/tmp/ansible-tmp-1773838921.111843-5822-175046227805094/source", "state": "file", "uid": 0}

TASK [infra.leapp.analysis : Include tasks for preupg assistant analysis] ******
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/main.yml:19
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.analysis : Include tasks for leapp preupgrade analysis] ******
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/main.yml:23
included: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml for managed-node01

TASK [analysis-leapp | Include pre_upgrade.yml] ********************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml:4

TASK [infra.leapp.common : pre_upgrade | Register with Satellite activation key] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/pre_upgrade.yml:3
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [pre_upgrade | Include custom_local_repos for local_repos_pre_leapp] ******
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/pre_upgrade.yml:10
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : pre_upgrade | Get package version lock entries] *****
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/pre_upgrade.yml:21
ok: [managed-node01] => {"changed": false, "cmd": ["dnf", "versionlock", "list"], "delta": "0:00:00.554780", "end": "2026-03-18 09:02:02.740262", "failed_when_result": false, "msg": "non-zero return code", "rc": 1, "start": "2026-03-18 09:02:02.185482", "stderr": "No such command: versionlock. Please use /usr/bin/dnf --help\nIt could be a DNF plugin command, try: \"dnf install 'dnf-command(versionlock)'\"", "stderr_lines": ["No such command: versionlock. Please use /usr/bin/dnf --help", "It could be a DNF plugin command, try: \"dnf install 'dnf-command(versionlock)'\""], "stdout": "", "stdout_lines": []}

TASK [infra.leapp.common : pre_upgrade | Remove all package version locks] *****
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/pre_upgrade.yml:28
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : pre_upgrade | Install packages for upgrade from RHEL 7] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/pre_upgrade.yml:36
changed: [managed-node01] => {"changed": true, "msg": "", "rc": 0, "results": ["Installed: policycoreutils-python-utils-2.9-26.el8_10.noarch", "Installed: leapp-0.20.0-1.el8_10.noarch", "Installed: leapp-deps-0.20.0-1.el8_10.noarch", "Installed: leapp-upgrade-el8toel9-0.23.0-1.el8_10.noarch", "Installed: leapp-upgrade-el8toel9-deps-0.23.0-1.el8_10.noarch", "Installed: systemd-container-239-82.el8_10.15.x86_64", "Installed: python3-leapp-0.20.0-1.el8_10.noarch"]}

TASK [infra.leapp.common : pre_upgrade | Include update-and-reboot.yml] ********
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/pre_upgrade.yml:45
included: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/update-and-reboot.yml for managed-node01

TASK [infra.leapp.common : update-and-reboot | Ensure all updates are applied] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/update-and-reboot.yml:2
ASYNC OK on managed-node01: jid=j443504284291.7034
changed: [managed-node01] => {"ansible_job_id": "j443504284291.7034", "changed": true, "finished": 1, "msg": "", "rc": 0, "results": ["Installed: libatasmart-0.19-14.el8.x86_64", "Installed: libblockdev-2.28-7.el8_10.x86_64", "Installed: libblockdev-crypto-2.28-7.el8_10.x86_64", "Installed: libsmbios-2.4.1-2.el8.x86_64", "Installed: libblockdev-fs-2.28-7.el8_10.x86_64", "Installed: bubblewrap-0.4.0-2.el8_10.x86_64", "Installed: libblockdev-loop-2.28-7.el8_10.x86_64", "Installed: glibc-headers-2.28-251.el8_10.31.x86_64", "Installed: libblockdev-mdraid-2.28-7.el8_10.x86_64", "Installed: udisks2-2.9.0-16.el8_10.1.x86_64", "Installed: libblockdev-part-2.28-7.el8_10.x86_64", "Installed: libblockdev-swap-2.28-7.el8_10.x86_64", "Installed: libblockdev-utils-2.28-7.el8_10.x86_64", "Installed: glibc-langpack-en-2.28-251.el8_10.31.x86_64", "Installed: libudisks2-2.9.0-16.el8_10.1.x86_64", "Installed: libbytesize-1.4-3.el8.x86_64", "Installed: mdadm-4.2-19.el8_10.x86_64", "Installed: libnfsidmap-1:2.3.3-68.el8_10.x86_64", "Installed: fwupd-1.7.8-2.el8.x86_64", "Installed: libpng-2:1.6.34-10.el8_10.x86_64", "Installed: libgcab1-1.1-1.el8.x86_64", "Installed: grub2-common-1:2.02-170.el8_10.1.noarch", "Installed: nfs-utils-1:2.3.3-68.el8_10.x86_64", "Installed: grub2-efi-x64-1:2.02-170.el8_10.1.x86_64", "Installed: grub2-efi-x64-modules-1:2.02-170.el8_10.1.noarch", "Installed: grub2-pc-1:2.02-170.el8_10.1.x86_64", "Installed: grub2-pc-modules-1:2.02-170.el8_10.1.noarch", "Installed: libgudev-232-4.el8.x86_64", "Installed: grub2-tools-efi-1:2.02-170.el8_10.1.x86_64", "Installed: libgusb-0.3.0-1.el8.x86_64", "Installed: grub2-tools-1:2.02-170.el8_10.1.x86_64", "Installed: grub2-tools-extra-1:2.02-170.el8_10.1.x86_64", "Installed: grub2-tools-minimal-1:2.02-170.el8_10.1.x86_64", "Installed: libxmlb-0.1.15-1.el8.x86_64", "Installed: glibc-2.28-251.el8_10.31.x86_64", "Installed: glibc-common-2.28-251.el8_10.31.x86_64", "Installed: glibc-devel-2.28-251.el8_10.31.x86_64", "Installed: volume_key-libs-0.3.11-6.el8.x86_64", "Installed: glibc-gconv-extra-2.28-251.el8_10.31.x86_64", "Installed: dosfstools-4.1-6.el8.x86_64", "Removed: grub2-common-1:2.02-169.el8_10.noarch", "Removed: grub2-efi-x64-1:2.02-169.el8_10.x86_64", "Removed: grub2-efi-x64-modules-1:2.02-169.el8_10.noarch", "Removed: grub2-pc-1:2.02-169.el8_10.x86_64", "Removed: grub2-pc-modules-1:2.02-169.el8_10.noarch", "Removed: grub2-tools-1:2.02-169.el8_10.x86_64", "Removed: grub2-tools-extra-1:2.02-169.el8_10.x86_64", "Removed: grub2-tools-minimal-1:2.02-169.el8_10.x86_64", "Removed: libnfsidmap-1:2.3.3-66.el8_10.x86_64", "Removed: glibc-2.28-251.el8_10.27.x86_64", "Removed: glibc-common-2.28-251.el8_10.27.x86_64", "Removed: glibc-devel-2.28-251.el8_10.27.x86_64", "Removed: glibc-gconv-extra-2.28-251.el8_10.27.x86_64", "Removed: glibc-headers-2.28-251.el8_10.27.x86_64", "Removed: glibc-langpack-en-2.28-251.el8_10.27.x86_64", "Removed: libpng-2:1.6.34-9.el8_10.x86_64", "Removed: dbxtool-8-5.el8_3.2.x86_64", "Removed: nfs-utils-1:2.3.3-66.el8_10.x86_64"], "results_file": "/root/.ansible_async/j443504284291.7034", "started": 1, "stderr": "", "stderr_lines": [], "stdout": "", "stdout_lines": []}

TASK [infra.leapp.common : update-and-reboot | Reboot when updates applied] ****
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/update-and-reboot.yml:10
fatal: [managed-node01]: FAILED! => {"msg": "The field 'timeout' has an invalid value, which includes an undefined variable. The error was: 'leapp_reboot_timeout' is undefined. 'leapp_reboot_timeout' is undefined\n\nThe error appears to be in '/root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/update-and-reboot.yml': line 10, column 3, but may\nbe elsewhere in the file depending on the exact syntax problem.\n\nThe offending line appears to be:\n\n\n- name: update-and-reboot | Reboot when updates applied\n ^ here\n"}

TASK [analysis-leapp | Include custom_local_repos for local_repos_post_analysis] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml:70
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [analysis-leapp | Restore original Satellite activation key] **************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml:80
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [analysis-leapp | Copy reports to the controller] *************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml:91

TASK [infra.leapp.common : copy_reports_to_controller | Ensure reports directory on controller] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_reports_to_controller.yml:20
changed: [managed-node01 -> localhost] => {"changed": true, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tests/ansible_leapp_analysis_logs_2026-03-18_13-01-55", "secontext": "unconfined_u:object_r:admin_home_t:s0", "size": 6, "state": "directory", "uid": 0}

TASK [infra.leapp.common : copy_reports_to_controller | Fetch report files if they exist] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_reports_to_controller.yml:30
included: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/fetch_file_if_exists.yml for managed-node01 => (item=/var/log/leapp/leapp-report.txt)
included: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/fetch_file_if_exists.yml for managed-node01 => (item=/var/log/leapp/leapp-report.json)
included: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/fetch_file_if_exists.yml for managed-node01 => (item=/var/log/leapp/leapp-preupgrade.log)

TASK [infra.leapp.common : fetch_file_if_exists | Check if file exists] ********
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/fetch_file_if_exists.yml:7
ok: [managed-node01] => {"changed": false, "stat": {"exists": false}}

TASK [infra.leapp.common : fetch_file_if_exists | Copy report file to the controller] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/fetch_file_if_exists.yml:12
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : fetch_file_if_exists | Check if file exists] ********
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/fetch_file_if_exists.yml:7
ok: [managed-node01] => {"changed": false, "stat": {"exists": false}}

TASK [infra.leapp.common : fetch_file_if_exists | Copy report file to the controller] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/fetch_file_if_exists.yml:12
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : fetch_file_if_exists | Check if file exists] ********
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/fetch_file_if_exists.yml:7
ok: [managed-node01] => {"changed": false, "stat": {"exists": false}}

TASK [infra.leapp.common : fetch_file_if_exists | Copy report file to the controller] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/fetch_file_if_exists.yml:12
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [copy_reports_to_controller | Copy log file to the controller] ************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_reports_to_controller.yml:39

TASK [infra.leapp.common : copy_archive_leapp_log | Check for log file] ********
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_archive_leapp_log.yml:11
ok: [managed-node01] => {"changed": false, "stat": {"atime": 1773838921.6710172, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "97d170e1550eee4afc0af065b78cda302a97674c", "ctime": 1773838921.6730173, "dev": 51715, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 645922952, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1773838921.423018, "nlink": 1, "path": "/var/log/leapp/ansible_leapp_analysis.log", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 2, "uid": 0, "version": "1137689257", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}}

TASK [infra.leapp.common : copy_archive_leapp_log | Add end time to log file] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_archive_leapp_log.yml:19
changed: [managed-node01] => {"backup": "", "changed": true, "msg": "line added"}

TASK [infra.leapp.common : copy_archive_leapp_log | Slurp file /var/log/leapp/ansible_leapp_analysis.log] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_archive_leapp_log.yml:27
ok: [managed-node01] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}

TASK [infra.leapp.common : copy_archive_leapp_log | Decode file /var/log/leapp/ansible_leapp_analysis.log] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_archive_leapp_log.yml:33
ok: [managed-node01] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}

TASK [infra.leapp.common : copy_archive_leapp_log | Ensure reports directory on controller] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_archive_leapp_log.yml:38
ok: [managed-node01 -> localhost] => {"changed": false, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tests/ansible_leapp_analysis_logs_2026-03-18_13-01-55", "secontext": "unconfined_u:object_r:admin_home_t:s0", "size": 6, "state": "directory", "uid": 0}

TASK [infra.leapp.common : copy_archive_leapp_log | Copy ansible leapp log to the controller] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_archive_leapp_log.yml:48
changed: [managed-node01] => {"changed": true, "checksum": "b82aaabd41b98dae685c4babba7e3d8c6b273874", "dest": "/root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tests/ansible_leapp_analysis_logs_2026-03-18_13-01-55/managed-node01/ansible_leapp_analysis.log", "md5sum": "b51e51552de12c3a08a304f1dfccefe1", "remote_checksum": "b82aaabd41b98dae685c4babba7e3d8c6b273874", "remote_md5sum": null}

TASK [infra.leapp.common : copy_archive_leapp_log | Copy log file to timestamped location] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_archive_leapp_log.yml:54
changed: [managed-node01] => {"changed": true, "checksum": "b82aaabd41b98dae685c4babba7e3d8c6b273874", "dest": "/var/log/leapp/ansible_leapp_analysis_2026-03-18_13-01-55.log", "gid": 0, "group": "root", "md5sum": "b51e51552de12c3a08a304f1dfccefe1", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:var_log_t:s0", "size": 37, "src": "/var/log/leapp/ansible_leapp_analysis.log", "state": "file", "uid": 0}

TASK [infra.leapp.common : copy_archive_leapp_log | Remove original log file] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_archive_leapp_log.yml:66
changed: [managed-node01] => {"changed": true, "path": "/var/log/leapp/ansible_leapp_analysis.log", "state": "absent"}

TASK [Test | Include cleanup logs] *********************************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tests/tests_default.yml:13

TASK [infra.leapp.common : cleanup_logs | Cleanup | Remove log files] **********
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/cleanup_logs.yml:2
changed: [managed-node01] => {"changed": true, "cmd": "set -euxo pipefail\nrm -f /var/log/leapp/*.log\nrm -f /var/log/leapp/*.json\nrm -f /var/log/leapp/*.txt\n", "delta": "0:00:00.005746", "end": "2026-03-18 09:03:17.567485", "msg": "", "rc": 0, "start": "2026-03-18 09:03:17.561739", "stderr": "+ rm -f /var/log/leapp/ansible_leapp_analysis_2026-03-18_13-01-55.log\n+ rm -f '/var/log/leapp/*.json'\n+ rm -f '/var/log/leapp/*.txt'", "stderr_lines": ["+ rm -f /var/log/leapp/ansible_leapp_analysis_2026-03-18_13-01-55.log", "+ rm -f '/var/log/leapp/*.json'", "+ rm -f '/var/log/leapp/*.txt'"], "stdout": "", "stdout_lines": []}

PLAY RECAP *********************************************************************
managed-node01             : ok=31   changed=13   unreachable=0    failed=1    skipped=10   rescued=0    ignored=0

-- Logs begin at Wed 2026-03-18 08:57:57 EDT, end at Wed 2026-03-18 09:03:17 EDT. --
Mar 18 09:01:54 managed-node01 sshd[4615]: Accepted publickey for root from 10.31.40.234 port 41796 ssh2: ECDSA SHA256:Vge93M0aHCBgo1IUfcKx6Yq8LKsqsMC5D+QXx8ms+30
Mar 18 09:01:54 managed-node01 systemd[1]: Started Session 8 of user root.
-- Subject: Unit session-8.scope has finished start-up
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- Unit session-8.scope has finished starting up.
--
-- The start-up result is done.
Mar 18 09:01:54 managed-node01 systemd-logind[603]: New session 8 of user root.
-- Subject: A new session 8 has been created for user root
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
-- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A new session with the ID 8 has been created for the user root.
--
-- The leading process of the session is 4615.
Mar 18 09:01:54 managed-node01 sshd[4615]: pam_unix(sshd:session): session opened for user root by (uid=0)
Mar 18 09:01:55 managed-node01 platform-python[4739]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Mar 18 09:01:56 managed-node01 platform-python[4873]: ansible-ansible.builtin.file Invoked with path=/var/log/leapp state=directory owner=root group=root mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Mar 18 09:01:56 managed-node01 platform-python[4978]: ansible-ansible.builtin.stat Invoked with path=/var/log/leapp/ansible_leapp_analysis.log follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Mar 18 09:01:57 managed-node01 platform-python[5083]: ansible-ansible.legacy.stat Invoked with path=/var/log/leapp/ansible_leapp_analysis.log follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Mar 18 09:01:57 managed-node01 platform-python[5167]: ansible-ansible.legacy.copy Invoked with dest=/var/log/leapp/ansible_leapp_analysis.log owner=root group=root mode=0644 src=/root/.ansible/tmp/ansible-tmp-1773838916.8280566-5744-269004054828565/source _original_basename=tmp8y7ltfc4 follow=False checksum=9b939dd04ee0ee57ece2b51ab712679728192e47 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Mar 18 09:01:57 managed-node01 platform-python[5274]: ansible-ansible.builtin.file Invoked with path=/etc/ansible/facts.d state=directory mode=0755 owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Mar 18 09:01:58 managed-node01 platform-python[5379]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/pre_ipu.fact follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Mar 18 09:01:58 managed-node01 platform-python[5463]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/pre_ipu.fact mode=0644 owner=root group=root src=/root/.ansible/tmp/ansible-tmp-1773838917.9049356-5772-236862838640271/source _original_basename=tmpu64zpfo1 follow=False checksum=fb7dd957f444ef9a7db8fd28f040bb66ad7fe777 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Mar 18 09:01:58 managed-node01 platform-python[5570]: ansible-ansible.legacy.stat Invoked with path=/var/log/leapp/ansible_leapp_analysis.log follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Mar 18 09:01:59 managed-node01 platform-python[5656]: ansible-ansible.legacy.copy Invoked with dest=/var/log/leapp/ansible_leapp_analysis.log mode=0644 owner=root group=root src=/root/.ansible/tmp/ansible-tmp-1773838918.5316932-5772-181941647262429/source _original_basename=tmpqj65fl5s follow=False checksum=fb7dd957f444ef9a7db8fd28f040bb66ad7fe777 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Mar 18 09:01:59 managed-node01 platform-python[5763]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; export PATH=$PATH; rpm -qa | grep -ve '[\.|+]el8' | grep -vE '^(gpg-pubkey|libmodulemd|katello-ca-consumer)' | sort _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Mar 18 09:02:00 managed-node01 platform-python[5873]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/non_rhel_packages.fact follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Mar 18 09:02:01 managed-node01 platform-python[5957]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/non_rhel_packages.fact mode=0644 owner=root group=root src=/root/.ansible/tmp/ansible-tmp-1773838920.5072963-5822-178778724132474/source _original_basename=tmpsjxjxw3r follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Mar 18 09:02:01 managed-node01 platform-python[6064]: ansible-ansible.legacy.stat Invoked with path=/var/log/leapp/ansible_leapp_analysis.log follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Mar 18 09:02:01 managed-node01 platform-python[6150]: ansible-ansible.legacy.copy Invoked with dest=/var/log/leapp/ansible_leapp_analysis.log mode=0644 owner=root group=root src=/root/.ansible/tmp/ansible-tmp-1773838921.111843-5822-175046227805094/source _original_basename=tmp2iahlh51 follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Mar 18 09:02:02 managed-node01 platform-python[6257]: ansible-ansible.legacy.command Invoked with _raw_params=dnf versionlock list _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Mar 18 09:02:03 managed-node01 platform-python[6364]: ansible-ansible.legacy.dnf Invoked with name=['leapp-upgrade'] state=latest allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Mar 18 09:02:09 managed-node01 dbus-daemon[597]: [system] Reloaded configuration
Mar 18 09:02:09 managed-node01 dbus-daemon[597]: [system] Reloaded configuration
Mar 18 09:02:09 managed-node01 dbus-daemon[597]: [system] Reloaded configuration
Mar 18 09:02:09 managed-node01 dbus-daemon[597]: [system] Reloaded configuration
Mar 18 09:02:09 managed-node01 dbus-daemon[597]: [system] Reloaded configuration
Mar 18 09:02:09 managed-node01 dbus-daemon[597]: [system] Reloaded configuration
Mar 18 09:02:09 managed-node01 dbus-daemon[597]: [system] Reloaded configuration
Mar 18 09:02:10 managed-node01 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
-- Subject: Unit run-rc5a958623dd64b3aaaaead23135d002b.service has finished start-up
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- Unit run-rc5a958623dd64b3aaaaead23135d002b.service has finished starting up.
--
-- The start-up result is done.
Mar 18 09:02:11 managed-node01 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Mar 18 09:02:11 managed-node01 systemd[1]: Starting man-db-cache-update.service...
-- Subject: Unit man-db-cache-update.service has begun start-up
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- Unit man-db-cache-update.service has begun starting up.
Mar 18 09:02:11 managed-node01 systemd[1]: Reloading.
Mar 18 09:02:12 managed-node01 systemd[1]: man-db-cache-update.service: Succeeded.
-- Subject: Unit succeeded
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- The unit man-db-cache-update.service has successfully entered the 'dead' state.
Mar 18 09:02:12 managed-node01 systemd[1]: Started man-db-cache-update.service.
-- Subject: Unit man-db-cache-update.service has finished start-up
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- Unit man-db-cache-update.service has finished starting up.
--
-- The start-up result is done.
Mar 18 09:02:12 managed-node01 systemd[1]: run-rc5a958623dd64b3aaaaead23135d002b.service: Succeeded.
-- Subject: Unit succeeded
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- The unit run-rc5a958623dd64b3aaaaead23135d002b.service has successfully entered the 'dead' state.
Mar 18 09:02:12 managed-node01 ansible-async_wrapper.py[7034]: Invoked with j443504284291 7200 /root/.ansible/tmp/ansible-tmp-1773838931.9635458-5918-38564356110799/AnsiballZ_dnf.py _
Mar 18 09:02:12 managed-node01 ansible-async_wrapper.py[7038]: Starting module and watcher
Mar 18 09:02:12 managed-node01 ansible-async_wrapper.py[7038]: Start watching 7039 (7200)
Mar 18 09:02:12 managed-node01 ansible-async_wrapper.py[7039]: Start module (7039)
Mar 18 09:02:12 managed-node01 ansible-async_wrapper.py[7034]: Return async_wrapper task started.
Mar 18 09:02:12 managed-node01 platform-python[7040]: ansible-ansible.legacy.dnf Invoked with name=['*'] state=latest allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Mar 18 09:02:17 managed-node01 ansible-async_wrapper.py[7038]: 7039 still running (7200)
Mar 18 09:02:21 managed-node01 systemd[1]: Stopping Command Scheduler...
-- Subject: Unit crond.service has begun shutting down
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- Unit crond.service has begun shutting down.
Mar 18 09:02:21 managed-node01 crond[1368]: (CRON) INFO (Shutting down)
Mar 18 09:02:21 managed-node01 systemd[1]: crond.service: Succeeded.
-- Subject: Unit succeeded
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- The unit crond.service has successfully entered the 'dead' state.
Mar 18 09:02:21 managed-node01 systemd[1]: Stopped Command Scheduler. -- Subject: Unit crond.service has finished shutting down -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit crond.service has finished shutting down. Mar 18 09:02:21 managed-node01 systemd[1]: crond.service: Found left-over process 4567 (anacron) in control group while starting unit. Ignoring. Mar 18 09:02:21 managed-node01 systemd[1]: This usually indicates unclean termination of a previous run, or service implementation deficiencies. Mar 18 09:02:21 managed-node01 systemd[1]: Started Command Scheduler. -- Subject: Unit crond.service has finished start-up -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit crond.service has finished starting up. -- -- The start-up result is done. Mar 18 09:02:21 managed-node01 crond[7054]: (CRON) STARTUP (1.5.2) Mar 18 09:02:21 managed-node01 crond[7054]: (CRON) INFO (Syslog will be used instead of sendmail.) Mar 18 09:02:21 managed-node01 crond[7054]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 85% if used.) Mar 18 09:02:21 managed-node01 crond[7054]: (CRON) INFO (running with inotify support) Mar 18 09:02:21 managed-node01 crond[7054]: (CRON) INFO (@reboot jobs will be run at computer's startup.) Mar 18 09:02:22 managed-node01 ansible-async_wrapper.py[7038]: 7039 still running (7195) Mar 18 09:02:22 managed-node01 dbus-daemon[597]: [system] Reloaded configuration Mar 18 09:02:22 managed-node01 dbus-daemon[597]: [system] Reloaded configuration Mar 18 09:02:22 managed-node01 dbus-daemon[597]: [system] Reloaded configuration Mar 18 09:02:22 managed-node01 dbus-daemon[597]: [system] Reloaded configuration Mar 18 09:02:22 managed-node01 systemd-udevd[527]: Network interface NamePolicy= disabled on kernel command line, ignoring. 
Mar 18 09:02:22 managed-node01 systemd-logind[603]: Watching system buttons on /dev/input/event0 (Power Button) Mar 18 09:02:22 managed-node01 systemd-logind[603]: Watching system buttons on /dev/input/event1 (Sleep Button) Mar 18 09:02:22 managed-node01 systemd-udevd[7075]: Using default interface naming scheme 'rhel-8.0'. Mar 18 09:02:23 managed-node01 systemd-logind[603]: Watching system buttons on /dev/input/event2 (AT Translated Set 2 keyboard) Mar 18 09:02:23 managed-node01 systemd[1]: Stopping GSSAPI Proxy Daemon... -- Subject: Unit gssproxy.service has begun shutting down -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit gssproxy.service has begun shutting down. Mar 18 09:02:23 managed-node01 systemd[1]: gssproxy.service: Succeeded. -- Subject: Unit succeeded -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- The unit gssproxy.service has successfully entered the 'dead' state. Mar 18 09:02:23 managed-node01 systemd[1]: Stopped GSSAPI Proxy Daemon. -- Subject: Unit gssproxy.service has finished shutting down -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit gssproxy.service has finished shutting down. Mar 18 09:02:23 managed-node01 systemd[1]: Starting GSSAPI Proxy Daemon... -- Subject: Unit gssproxy.service has begun start-up -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit gssproxy.service has begun starting up. Mar 18 09:02:23 managed-node01 systemd[1]: Started GSSAPI Proxy Daemon. -- Subject: Unit gssproxy.service has finished start-up -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit gssproxy.service has finished starting up. -- -- The start-up result is done. Mar 18 09:02:23 managed-node01 systemd[1]: Stopping GSSAPI Proxy Daemon... -- Subject: Unit gssproxy.service has begun shutting down -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit gssproxy.service has begun shutting down. 
Mar 18 09:02:23 managed-node01 systemd[1]: gssproxy.service: Succeeded. -- Subject: Unit succeeded -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- The unit gssproxy.service has successfully entered the 'dead' state. Mar 18 09:02:23 managed-node01 systemd[1]: Stopped GSSAPI Proxy Daemon. -- Subject: Unit gssproxy.service has finished shutting down -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit gssproxy.service has finished shutting down. Mar 18 09:02:23 managed-node01 systemd[1]: Starting GSSAPI Proxy Daemon... -- Subject: Unit gssproxy.service has begun start-up -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit gssproxy.service has begun starting up. Mar 18 09:02:23 managed-node01 systemd[1]: Started GSSAPI Proxy Daemon. -- Subject: Unit gssproxy.service has finished start-up -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit gssproxy.service has finished starting up. -- -- The start-up result is done. Mar 18 09:02:23 managed-node01 systemd[1]: Stopping GSSAPI Proxy Daemon... -- Subject: Unit gssproxy.service has begun shutting down -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit gssproxy.service has begun shutting down. Mar 18 09:02:23 managed-node01 systemd[1]: gssproxy.service: Succeeded. -- Subject: Unit succeeded -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- The unit gssproxy.service has successfully entered the 'dead' state. Mar 18 09:02:23 managed-node01 systemd[1]: Stopped GSSAPI Proxy Daemon. -- Subject: Unit gssproxy.service has finished shutting down -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit gssproxy.service has finished shutting down. Mar 18 09:02:23 managed-node01 systemd[1]: Starting GSSAPI Proxy Daemon... 
-- Subject: Unit gssproxy.service has begun start-up -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit gssproxy.service has begun starting up. Mar 18 09:02:23 managed-node01 systemd[1]: Started GSSAPI Proxy Daemon. -- Subject: Unit gssproxy.service has finished start-up -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit gssproxy.service has finished starting up. -- -- The start-up result is done. Mar 18 09:02:23 managed-node01 dbus-daemon[597]: [system] Reloaded configuration Mar 18 09:02:23 managed-node01 dbus-daemon[597]: [system] Reloaded configuration Mar 18 09:02:23 managed-node01 dbus-daemon[597]: [system] Reloaded configuration Mar 18 09:02:23 managed-node01 dbus-daemon[597]: [system] Reloaded configuration Mar 18 09:02:23 managed-node01 polkitd[938]: Reloading rules Mar 18 09:02:23 managed-node01 polkitd[938]: Collecting garbage unconditionally... Mar 18 09:02:23 managed-node01 polkitd[938]: Loading rules from directory /etc/polkit-1/rules.d Mar 18 09:02:23 managed-node01 polkitd[938]: Loading rules from directory /usr/share/polkit-1/rules.d Mar 18 09:02:23 managed-node01 polkitd[938]: Finished loading, compiling and executing 3 rules Mar 18 09:02:23 managed-node01 polkitd[938]: Reloading rules Mar 18 09:02:23 managed-node01 polkitd[938]: Collecting garbage unconditionally... Mar 18 09:02:23 managed-node01 polkitd[938]: Loading rules from directory /etc/polkit-1/rules.d Mar 18 09:02:23 managed-node01 polkitd[938]: Loading rules from directory /usr/share/polkit-1/rules.d Mar 18 09:02:23 managed-node01 polkitd[938]: Finished loading, compiling and executing 3 rules Mar 18 09:02:24 managed-node01 systemd[1]: Stopped target NFS client services. -- Subject: Unit nfs-client.target has finished shutting down -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit nfs-client.target has finished shutting down. Mar 18 09:02:24 managed-node01 systemd[1]: Stopping NFS client services. 
-- Subject: Unit nfs-client.target has begun shutting down -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit nfs-client.target has begun shutting down. Mar 18 09:02:24 managed-node01 systemd[1]: Reached target NFS client services. -- Subject: Unit nfs-client.target has finished start-up -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit nfs-client.target has finished starting up. -- -- The start-up result is done. Mar 18 09:02:24 managed-node01 systemd[1]: Reloading. Mar 18 09:02:24 managed-node01 systemd[1]: Reloading. Mar 18 09:02:25 managed-node01 systemd-udevd[527]: Network interface NamePolicy= disabled on kernel command line, ignoring. Mar 18 09:02:25 managed-node01 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. -- Subject: Unit run-rfb711883a73a4d3aa1e9508722b23672.service has finished start-up -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit run-rfb711883a73a4d3aa1e9508722b23672.service has finished starting up. -- -- The start-up result is done. Mar 18 09:02:25 managed-node01 systemd[1]: Starting man-db-cache-update.service... -- Subject: Unit man-db-cache-update.service has begun start-up -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit man-db-cache-update.service has begun starting up. Mar 18 09:02:25 managed-node01 systemd[1]: Reloading. Mar 18 09:02:25 managed-node01 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. -- Subject: Unit run-rd86c19d8fe524ff28824558fc3a3e1ed.service has finished start-up -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit run-rd86c19d8fe524ff28824558fc3a3e1ed.service has finished starting up. -- -- The start-up result is done. Mar 18 09:02:26 managed-node01 ansible-async_wrapper.py[7039]: Module complete (7039) Mar 18 09:02:26 managed-node01 systemd[1]: man-db-cache-update.service: Succeeded. 
-- Subject: Unit succeeded -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- The unit man-db-cache-update.service has successfully entered the 'dead' state. Mar 18 09:02:26 managed-node01 systemd[1]: Started man-db-cache-update.service. -- Subject: Unit man-db-cache-update.service has finished start-up -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit man-db-cache-update.service has finished starting up. -- -- The start-up result is done. Mar 18 09:02:26 managed-node01 systemd[1]: run-rd86c19d8fe524ff28824558fc3a3e1ed.service: Succeeded. -- Subject: Unit succeeded -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- The unit run-rd86c19d8fe524ff28824558fc3a3e1ed.service has successfully entered the 'dead' state. Mar 18 09:02:26 managed-node01 systemd[1]: run-rfb711883a73a4d3aa1e9508722b23672.service: Succeeded. -- Subject: Unit succeeded -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- The unit run-rfb711883a73a4d3aa1e9508722b23672.service has successfully entered the 'dead' state. Mar 18 09:02:27 managed-node01 ansible-async_wrapper.py[7038]: Done in kid B. Mar 18 09:03:11 managed-node01 sshd[4618]: Received disconnect from 10.31.40.234 port 41796:11: disconnected by user Mar 18 09:03:11 managed-node01 sshd[4618]: Disconnected from user root 10.31.40.234 port 41796 Mar 18 09:03:11 managed-node01 sshd[4615]: pam_unix(sshd:session): session closed for user root Mar 18 09:03:11 managed-node01 systemd[1]: session-8.scope: Succeeded. -- Subject: Unit succeeded -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- The unit session-8.scope has successfully entered the 'dead' state. Mar 18 09:03:11 managed-node01 systemd-logind[603]: Session 8 logged out. Waiting for processes to exit. Mar 18 09:03:11 managed-node01 systemd-logind[603]: Removed session 8. 
-- Subject: Session 8 has been terminated -- Defined-By: systemd -- Support: https://access.redhat.com/support -- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat -- -- A session with the ID 8 has been terminated. Mar 18 09:03:12 managed-node01 sshd[9427]: Accepted publickey for root from 10.31.40.234 port 53414 ssh2: ECDSA SHA256:Vge93M0aHCBgo1IUfcKx6Yq8LKsqsMC5D+QXx8ms+30 Mar 18 09:03:12 managed-node01 systemd[1]: Started Session 9 of user root. -- Subject: Unit session-9.scope has finished start-up -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit session-9.scope has finished starting up. -- -- The start-up result is done. Mar 18 09:03:12 managed-node01 systemd-logind[603]: New session 9 of user root. -- Subject: A new session 9 has been created for user root -- Defined-By: systemd -- Support: https://access.redhat.com/support -- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat -- -- A new session with the ID 9 has been created for the user root. -- -- The leading process of the session is 9427. 
Mar 18 09:03:12 managed-node01 sshd[9427]: pam_unix(sshd:session): session opened for user root by (uid=0) Mar 18 09:03:12 managed-node01 platform-python[9533]: ansible-ansible.legacy.async_status Invoked with jid=j443504284291.7034 mode=status _async_dir=/root/.ansible_async Mar 18 09:03:13 managed-node01 platform-python[9601]: ansible-ansible.legacy.async_status Invoked with jid=j443504284291.7034 mode=cleanup _async_dir=/root/.ansible_async Mar 18 09:03:13 managed-node01 platform-python[9706]: ansible-ansible.builtin.stat Invoked with path=/var/log/leapp/leapp-report.txt follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Mar 18 09:03:14 managed-node01 platform-python[9811]: ansible-ansible.builtin.stat Invoked with path=/var/log/leapp/leapp-report.json follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Mar 18 09:03:14 managed-node01 platform-python[9916]: ansible-ansible.builtin.stat Invoked with path=/var/log/leapp/leapp-preupgrade.log follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Mar 18 09:03:14 managed-node01 platform-python[10021]: ansible-ansible.builtin.stat Invoked with path=/var/log/leapp/ansible_leapp_analysis.log follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Mar 18 09:03:15 managed-node01 platform-python[10128]: ansible-ansible.builtin.lineinfile Invoked with path=/var/log/leapp/ansible_leapp_analysis.log line=Job ended at 2026-03-18T13:03:15Z owner=root group=root mode=0644 state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None seuser=None serole=None selevel=None setype=None attributes=None Mar 18 09:03:16 managed-node01 platform-python[10338]: ansible-ansible.legacy.stat Invoked with 
path=/var/log/leapp/ansible_leapp_analysis.log follow=True get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Mar 18 09:03:16 managed-node01 platform-python[10460]: ansible-ansible.legacy.copy Invoked with src=/var/log/leapp/ansible_leapp_analysis.log dest=/var/log/leapp/ansible_leapp_analysis_2026-03-18_13-01-55.log remote_src=True mode=preserve backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Mar 18 09:03:17 managed-node01 platform-python[10567]: ansible-ansible.builtin.file Invoked with path=/var/log/leapp/ansible_leapp_analysis.log state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Mar 18 09:03:17 managed-node01 platform-python[10672]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail rm -f /var/log/leapp/*.log rm -f /var/log/leapp/*.json rm -f /var/log/leapp/*.txt _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Mar 18 09:03:17 managed-node01 sshd[10693]: Accepted publickey for root from 10.31.40.234 port 53422 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE Mar 18 09:03:17 managed-node01 systemd[1]: Started Session 10 of user root. -- Subject: Unit session-10.scope has finished start-up -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit session-10.scope has finished starting up. -- -- The start-up result is done. 
Mar 18 09:03:17 managed-node01 systemd-logind[603]: New session 10 of user root. -- Subject: A new session 10 has been created for user root -- Defined-By: systemd -- Support: https://access.redhat.com/support -- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat -- -- A new session with the ID 10 has been created for the user root. -- -- The leading process of the session is 10693. Mar 18 09:03:17 managed-node01 sshd[10693]: pam_unix(sshd:session): session opened for user root by (uid=0) Mar 18 09:03:17 managed-node01 sshd[10696]: Received disconnect from 10.31.40.234 port 53422:11: disconnected by user Mar 18 09:03:17 managed-node01 sshd[10696]: Disconnected from user root 10.31.40.234 port 53422 Mar 18 09:03:17 managed-node01 sshd[10693]: pam_unix(sshd:session): session closed for user root Mar 18 09:03:17 managed-node01 systemd[1]: session-10.scope: Succeeded. -- Subject: Unit succeeded -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- The unit session-10.scope has successfully entered the 'dead' state. Mar 18 09:03:17 managed-node01 systemd-logind[603]: Session 10 logged out. Waiting for processes to exit. Mar 18 09:03:17 managed-node01 systemd-logind[603]: Removed session 10. -- Subject: Session 10 has been terminated -- Defined-By: systemd -- Support: https://access.redhat.com/support -- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat -- -- A session with the ID 10 has been terminated. Mar 18 09:03:17 managed-node01 sshd[10714]: Accepted publickey for root from 10.31.40.234 port 53424 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE Mar 18 09:03:17 managed-node01 systemd[1]: Started Session 11 of user root. -- Subject: Unit session-11.scope has finished start-up -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit session-11.scope has finished starting up. -- -- The start-up result is done. 
Mar 18 09:03:17 managed-node01 systemd-logind[603]: New session 11 of user root. -- Subject: A new session 11 has been created for user root -- Defined-By: systemd -- Support: https://access.redhat.com/support -- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat -- -- A new session with the ID 11 has been created for the user root. -- -- The leading process of the session is 10714. Mar 18 09:03:17 managed-node01 sshd[10714]: pam_unix(sshd:session): session opened for user root by (uid=0)