[WARNING]: Collection infra.leapp does not support Ansible version 2.14.18
[WARNING]: running playbook inside collection infra.leapp
ansible-playbook [core 2.14.18]
  config file = /etc/ansible/ansible.cfg
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python3.9/site-packages/ansible
  ansible collection location = /root/.ansible/collections:/usr/share/ansible/collections
  executable location = /usr/bin/ansible-playbook
  python version = 3.9.25 (main, Mar 9 2026, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-14)] (/usr/bin/python3)
  jinja version = 3.1.2
  libyaml = True
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_hostvars.yml ***************************************************
1 plays in /root/.ansible/collections/ansible_collections/infra/leapp/tests/tests_hostvars.yml

PLAY [Test how analysis role generates hostvars file when it exists] ***********

TASK [Gathering Facts] *********************************************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tests_hostvars.yml:2
ok: [managed-node01]

TASK [Common_upgrade_tasks | Remove leapp packages] ****************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tests_hostvars.yml:12
changed: [managed-node01] => {"changed": true, "msg": "", "rc": 0, "results": ["Removed: leapp-upgrade-el9toel10-0.23.0-1.el9.noarch"]}

TASK [Common_upgrade_tasks | Gather setup tasks] *******************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tests_hostvars.yml:17
ok: [managed-node01 -> localhost] => {"changed": false, "examined": 4, "files": [{"atime": 1773930825.9419975, "ctime": 1773930825.7579947, "dev": 51716, "gid": 0, "gr_name": "root", "inode": 746586270, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mode": "0644", "mtime": 1773930825.7579947, "nlink": 1, "path": "/root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/setup/remediate_cifs.yml", "pw_name": "root", "rgrp": true, "roth": true, "rusr": true, "size": 272, "uid": 0, "wgrp": false, "woth": false, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}, {"atime": 1773930825.9419975, "ctime": 1773930825.7579947, "dev": 51716, "gid": 0, "gr_name": "root", "inode": 746586271, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mode": "0644", "mtime": 1773930825.7579947, "nlink": 1, "path": "/root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/setup/remediate_remote_using_root.yml", "pw_name": "root", "rgrp": true, "roth": true, "rusr": true, "size": 268, "uid": 0, "wgrp": false, "woth": false, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}, {"atime": 1773930825.9419975, "ctime": 1773930825.7579947, "dev": 51716, "gid": 0, "gr_name": "root", "inode": 746586272, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mode": "0644", "mtime": 1773930825.7579947, "nlink": 1, "path": "/root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/setup/remediate_removed_kernel_drivers.yml", "pw_name": "root", "rgrp": true, "roth": true, "rusr": true, "size": 913, "uid": 0, "wgrp": false, "woth": false, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}, {"atime": 1773930825.9419975, "ctime": 1773930825.7579947, "dev": 51716, "gid": 0, "gr_name": "root", "inode": 746586273, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mode": "0644", "mtime": 1773930825.7579947, "nlink": 1, "path": "/root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/setup/version_lock.yml", "pw_name": "root", "rgrp": true, "roth": true, "rusr": true, "size": 548, "uid": 0, "wgrp": false, "woth": false, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}], "matched": 4, "msg": "All paths examined", "skipped_paths": {}}

TASK [Common_upgrade_tasks | Do remediation setup tasks] ***********************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tests_hostvars.yml:26
skipping: [managed-node01] => (item=/root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/setup/remediate_cifs.yml) => {"ansible_loop_var": "setup_task_file", "changed": false, "setup_task_file": "/root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/setup/remediate_cifs.yml", "skip_reason": "Conditional result was False"}
skipping: [managed-node01] => (item=/root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/setup/remediate_remote_using_root.yml) => {"ansible_loop_var": "setup_task_file", "changed": false, "setup_task_file": "/root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/setup/remediate_remote_using_root.yml", "skip_reason": "Conditional result was False"}
skipping: [managed-node01] => (item=/root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/setup/remediate_removed_kernel_drivers.yml) => {"ansible_loop_var": "setup_task_file", "changed": false, "setup_task_file": "/root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/setup/remediate_removed_kernel_drivers.yml", "skip_reason": "Conditional result was False"}
skipping: [managed-node01] => {"changed": false, "msg": "All items skipped"}

TASK [Common_upgrade_tasks | Do setup tasks] ***********************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tests_hostvars.yml:37
included: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/setup/version_lock.yml for managed-node01 => (item=/root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/setup/version_lock.yml)

TASK [setup | version_lock | Install versionlock module] ***********************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/setup/version_lock.yml:2
changed: [managed-node01] => {"changed": true, "msg": "", "rc": 0, "results": ["Installed: python3-dnf-plugin-versionlock-4.3.0-24.el9_7.noarch"]}

TASK [setup | version_lock | Version lock the dracut package] ******************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/setup/version_lock.yml:7
changed: [managed-node01] => {"changed": true, "cmd": ["dnf", "versionlock", "add", "dracut"], "delta": "0:00:00.558928", "end": "2026-03-19 10:41:37.681719", "msg": "", "rc": 0, "start": "2026-03-19 10:41:37.122791", "stderr": "", "stderr_lines": [], "stdout": "Updating Subscription Management repositories.\nUnable to read consumer identity\n\nThis system is not registered with an entitlement server. You can use \"rhc\" or \"subscription-manager\" to register.\n\nLast metadata expiration check: 0:01:06 ago on Thu 19 Mar 2026 10:40:31 AM EDT.\nAdding versionlock on: dracut-0:057-104.git20250919.el9_7.*", "stdout_lines": ["Updating Subscription Management repositories.", "Unable to read consumer identity", "", "This system is not registered with an entitlement server. You can use \"rhc\" or \"subscription-manager\" to register.", "", "Last metadata expiration check: 0:01:06 ago on Thu 19 Mar 2026 10:40:31 AM EDT.", "Adding versionlock on: dracut-0:057-104.git20250919.el9_7.*"]}

TASK [Test | Create tempdir directory for workdir controller] ******************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tests_hostvars.yml:53
changed: [managed-node01 -> localhost] => {"changed": true, "gid": 0, "group": "root", "mode": "0700", "owner": "root", "path": "/tmp/workdir_controller8yrym8we", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 6, "state": "directory", "uid": 0}

TASK [Test | Prepare hostvars directory] ***************************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tests_hostvars.yml:59
changed: [managed-node01 -> localhost] => {"changed": true, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/tmp/workdir_controller8yrym8we/host_vars", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 6, "state": "directory", "uid": 0}

TASK [Test | Set fact with hostvars file] **************************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tests_hostvars.yml:65
ok: [managed-node01 -> localhost] => {"ansible_facts": {"leapp_hostvars_file": "/tmp/workdir_controller8yrym8we/host_vars/managed-node01.yml"}, "changed": false}

TASK [Test | Write a sample hostvars file] *************************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tests_hostvars.yml:69
changed: [managed-node01 -> localhost] => {"changed": true, "checksum": "165076eb6b9fac25e39014723e3cdb9b570bce81", "dest": "/tmp/workdir_controller8yrym8we/host_vars/managed-node01.yml", "gid": 0, "group": "root", "md5sum": "b94177b40d078db8db5b3fcf744d19b1", "mode": "0600", "owner": "root", "secontext": "unconfined_u:object_r:admin_home_t:s0", "size": 221, "src": "/root/.ansible/tmp/ansible-tmp-1773931299.2229424-9510-230545510627304/source", "state": "file", "uid": 0}

TASK [Test | Print hostvars file content] **************************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tests_hostvars.yml:87
ok: [managed-node01 -> localhost] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}

TASK [Test | Run role analysis] ************************************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tests_hostvars.yml:93

TASK [infra.leapp.analysis : Lock timestamped variables] ***********************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/main.yml:5
ok: [managed-node01] => {"ansible_facts": {"__leapp_timestamp": "2026-03-19_14-41-41"}, "changed": false}

TASK [Initialize lock, logging, and common vars] *******************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/main.yml:9

TASK [infra.leapp.common : init_leapp_log | Ensure that log directory exists] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/init_leapp_log.yml:9
changed: [managed-node01] => {"changed": true, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/var/log/leapp", "secontext": "system_u:object_r:var_log_t:s0", "size": 69, "state": "directory", "uid": 0}

TASK [infra.leapp.common : init_leapp_log | Check for existing log file] *******
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/init_leapp_log.yml:17
ok: [managed-node01] => {"changed": false, "stat": {"exists": false}}

TASK [infra.leapp.common : init_leapp_log | Fail if log file already exists] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/init_leapp_log.yml:22
skipping: [managed-node01]
=> {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : init_leapp_log | Create new log file] ***************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/init_leapp_log.yml:31
changed: [managed-node01] => {"changed": true, "checksum": "291d60f120986a014e6b93d58604fa7367bb7c2b", "dest": "/var/log/leapp/ansible_leapp_analysis.log", "gid": 0, "group": "root", "md5sum": "2f3329d5b6faa666cc958c381cf39fcc", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:var_log_t:s0", "size": 70, "src": "/root/.ansible/tmp/ansible-tmp-1773931302.3199677-9744-172705783645666/source", "state": "file", "uid": 0}

TASK [infra.leapp.common : init_leapp_log | /etc/ansible/facts.d directory exists] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/init_leapp_log.yml:41
ok: [managed-node01] => {"changed": false, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/etc/ansible/facts.d", "secontext": "unconfined_u:object_r:etc_t:s0", "size": 56, "state": "directory", "uid": 0}

TASK [infra.leapp.common : init_leapp_log | Capture current ansible_facts for validation after upgrade] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/init_leapp_log.yml:49
changed: [managed-node01] => (item=/etc/ansible/facts.d/pre_ipu.fact) => {"ansible_loop_var": "item", "changed": true, "checksum": "1cb12ee832733f8bc6d91d254b9aa733d6617829", "dest": "/etc/ansible/facts.d/pre_ipu.fact", "gid": 0, "group": "root", "item": "/etc/ansible/facts.d/pre_ipu.fact", "md5sum": "f794c0944eef2845efc501a420561238", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:etc_t:s0", "size": 14664, "src": "/root/.ansible/tmp/ansible-tmp-1773931303.7620137-9852-17060679618345/source", "state": "file", "uid": 0}
changed: [managed-node01] => (item=/var/log/leapp/ansible_leapp_analysis.log) => {"ansible_loop_var": "item", "changed": true, "checksum": "1cb12ee832733f8bc6d91d254b9aa733d6617829", "dest": "/var/log/leapp/ansible_leapp_analysis.log", "gid": 0, "group": "root", "item": "/var/log/leapp/ansible_leapp_analysis.log", "md5sum": "f794c0944eef2845efc501a420561238", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:var_log_t:s0", "size": 14664, "src": "/root/.ansible/tmp/ansible-tmp-1773931304.6849356-9852-11731134780159/source", "state": "file", "uid": 0}

TASK [infra.leapp.common : init_leapp_log | Capture a list of non-rhel versioned packages] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/init_leapp_log.yml:60
ok: [managed-node01] => {"changed": false, "cmd": "set -o pipefail; export PATH=$PATH; rpm -qa | grep -ve '[\\.|+]el9' | grep -vE '^(gpg-pubkey|libmodulemd|katello-ca-consumer)' | sort", "delta": "0:00:00.221649", "end": "2026-03-19 10:41:46.071266", "failed_when_result": false, "msg": "non-zero return code", "rc": 1, "start": "2026-03-19 10:41:45.849617", "stderr": "", "stderr_lines": [], "stdout": "", "stdout_lines": []}

TASK [infra.leapp.common : init_leapp_log | Create fact with the non-rhel versioned packages list] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/init_leapp_log.yml:74
ok: [managed-node01] => {"ansible_facts": {"non_rhel_packages": []}, "changed": false}

TASK [infra.leapp.common : init_leapp_log | Capture the list of non-rhel versioned packages in a separate fact file] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/init_leapp_log.yml:78
ok: [managed-node01] => (item=/etc/ansible/facts.d/non_rhel_packages.fact) => {"ansible_loop_var": "item", "changed": false, "checksum": "97d170e1550eee4afc0af065b78cda302a97674c", "dest": "/etc/ansible/facts.d/non_rhel_packages.fact", "gid": 0, "group": "root", "item": "/etc/ansible/facts.d/non_rhel_packages.fact", "mode": "0644", "owner": "root", "path": "/etc/ansible/facts.d/non_rhel_packages.fact", "secontext": "system_u:object_r:etc_t:s0", "size": 2, "state": "file", "uid": 0}
changed: [managed-node01] => (item=/var/log/leapp/ansible_leapp_analysis.log) => {"ansible_loop_var": "item", "changed": true, "checksum": "97d170e1550eee4afc0af065b78cda302a97674c", "dest": "/var/log/leapp/ansible_leapp_analysis.log", "gid": 0, "group": "root", "item": "/var/log/leapp/ansible_leapp_analysis.log", "md5sum": "d751713988987e9331980363e24189ce", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:var_log_t:s0", "size": 2, "src": "/root/.ansible/tmp/ansible-tmp-1773931307.0333068-10143-2422295405688/source", "state": "file", "uid": 0}

TASK [infra.leapp.analysis : Include tasks for preupg assistant analysis] ******
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/main.yml:19
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.analysis : Include tasks for leapp preupgrade analysis] ******
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/main.yml:23
included: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml for managed-node01

TASK [analysis-leapp | Include pre_upgrade_update.yml] *************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml:4

TASK [infra.leapp.common : pre_upgrade_update | Register with Satellite activation key] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/pre_upgrade_update.yml:3
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [pre_upgrade_update | Include custom_local_repos for local_repos_pre_leapp] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/pre_upgrade_update.yml:10

TASK [infra.leapp.common : custom_local_repos | Remove old /etc/leapp/files/leapp_upgrade_repositories.repo] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/custom_local_repos.yml:2
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : custom_local_repos | Validate repo definitions have baseurl or metalink] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/custom_local_repos.yml:9
skipping: [managed-node01] => {"changed": false, "skipped_reason": "No items in the list"}

TASK [infra.leapp.common : custom_local_repos | Enable custom upgrade yum repositories] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/custom_local_repos.yml:16
skipping: [managed-node01] => {"changed": false, "skipped_reason": "No items in the list"}

TASK [infra.leapp.common : pre_upgrade_update | Get package version lock entries] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/pre_upgrade_update.yml:21
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : pre_upgrade_update | Remove all package version locks] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/pre_upgrade_update.yml:28
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : pre_upgrade_update | Install packages for upgrade from RHEL 7] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/pre_upgrade_update.yml:36
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : pre_upgrade_update | Include update-and-reboot.yml] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/pre_upgrade_update.yml:45
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.analysis : analysis-leapp | Ensure leapp log directory exists] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml:11
changed: [managed-node01] => {"changed": true, "gid": 0, "group": "root", "mode": "0700", "owner": "root", "path": "/var/log/leapp", "secontext": "system_u:object_r:var_log_t:s0", "size": 103, "state": "directory", "uid": 0}

TASK [infra.leapp.analysis : analysis-leapp | Populate leapp_answers file] *****
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml:19
changed: [managed-node01] => {"changed": true, "checksum": "3d934ad808576e3a7fb4c14a89645a4ad55ccf53", "dest": "/var/log/leapp/answerfile", "gid": 0, "group": "root", "md5sum": "01e375235c8e4cafdec593b260354063", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:var_log_t:s0", "size": 48, "src": "/root/.ansible/tmp/ansible-tmp-1773931309.27762-10471-227346497135950/source", "state": "file", "uid": 0}

TASK [analysis-leapp | Create /etc/leapp/files/leapp_upgrade_repositories.repo] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml:28
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.analysis : analysis-leapp | Leapp preupgrade report] *********
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml:39
ASYNC FAILED on managed-node01: jid=j632050642122.14227
fatal: [managed-node01]: FAILED! => {"ansible_job_id": "j632050642122.14227", "changed": true, "cmd": "set -o pipefail; export PATH=$PATH; ulimit -n 16384; leapp preupgrade --report-schema=1.2.0 --no-rhsm 2>&1 | tee -a /var/log/leapp/ansible_leapp_analysis.log\n", "delta": "0:00:00.072352", "end": "2026-03-19 10:41:51.112870", "failed_when_result": true, "finished": 1, "msg": "non-zero return code", "rc": 1, "results_file": "/root/.ansible_async/j632050642122.14227", "start": "2026-03-19 10:41:51.040518", "started": 1, "stderr": "", "stderr_lines": [], "stdout": "Command \"preupgrade\" is unknown.\nMost likely there is a typo in the command or particular leapp repositories that provide this command are not present on the system.\nYou can try to install the missing content e.g. by the following command: `dnf install 'leapp-command(preupgrade)'`", "stdout_lines": ["Command \"preupgrade\" is unknown.", "Most likely there is a typo in the command or particular leapp repositories that provide this command are not present on the system.", "You can try to install the missing content e.g. by the following command: `dnf install 'leapp-command(preupgrade)'`"]}

TASK [analysis-leapp | Include custom_local_repos for local_repos_post_analysis] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml:70

TASK [infra.leapp.common : custom_local_repos | Remove old /etc/leapp/files/leapp_upgrade_repositories.repo] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/custom_local_repos.yml:2
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : custom_local_repos | Validate repo definitions have baseurl or metalink] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/custom_local_repos.yml:9
skipping: [managed-node01] => {"changed": false, "skipped_reason": "No items in the list"}

TASK [infra.leapp.common : custom_local_repos | Enable custom upgrade yum repositories] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/custom_local_repos.yml:16
skipping: [managed-node01] => {"changed": false, "skipped_reason": "No items in the list"}

TASK [analysis-leapp | Restore original Satellite activation key] **************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml:80
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [analysis-leapp | Copy reports to the controller] *************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml:91

TASK [infra.leapp.common : copy_reports_to_controller | Ensure reports directory on controller] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_reports_to_controller.yml:20
changed: [managed-node01 -> localhost] => {"changed": true, "gid": 0, "group": "root",
"mode": "0755", "owner": "root", "path": "/tmp/workdir_controller8yrym8we/ansible_leapp_analysis_logs_2026-03-19_14-41-41", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 6, "state": "directory", "uid": 0} TASK [infra.leapp.common : copy_reports_to_controller | Fetch report files if they exist] *** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_reports_to_controller.yml:30 included: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/fetch_file_if_exists.yml for managed-node01 => (item=/var/log/leapp/leapp-report.txt) included: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/fetch_file_if_exists.yml for managed-node01 => (item=/var/log/leapp/leapp-report.json) included: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/fetch_file_if_exists.yml for managed-node01 => (item=/var/log/leapp/leapp-preupgrade.log) TASK [infra.leapp.common : fetch_file_if_exists | Check if file exists] ******** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/fetch_file_if_exists.yml:7 ok: [managed-node01] => {"changed": false, "stat": {"exists": false}} TASK [infra.leapp.common : fetch_file_if_exists | Copy report file to the controller] *** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/fetch_file_if_exists.yml:12 skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"} TASK [infra.leapp.common : fetch_file_if_exists | Check if file exists] ******** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/fetch_file_if_exists.yml:7 ok: [managed-node01] => {"changed": false, "stat": {"exists": false}} TASK [infra.leapp.common : fetch_file_if_exists | Copy report file to the controller] *** task path: 
/root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/fetch_file_if_exists.yml:12 skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"} TASK [infra.leapp.common : fetch_file_if_exists | Check if file exists] ******** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/fetch_file_if_exists.yml:7 ok: [managed-node01] => {"changed": false, "stat": {"exists": false}} TASK [infra.leapp.common : fetch_file_if_exists | Copy report file to the controller] *** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/fetch_file_if_exists.yml:12 skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"} TASK [copy_reports_to_controller | Copy log file to the controller] ************ task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_reports_to_controller.yml:39 TASK [infra.leapp.common : copy_archive_leapp_log | Check for log file] ******** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_archive_leapp_log.yml:11 ok: [managed-node01] => {"changed": false, "stat": {"atime": 1773931307.644631, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "8f292c99a9dc14562f6c9acf79c0f0c1a3c22c69", "ctime": 1773931311.1046302, "dev": 51716, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 905970188, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "application/json", "mode": "0644", "mtime": 1773931311.1046302, "nlink": 1, "path": "/var/log/leapp/ansible_leapp_analysis.log", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 284, "uid": 0, "version": "2301403835", "wgrp": false, "woth": false, 
"writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}} TASK [infra.leapp.common : copy_archive_leapp_log | Add end time to log file] *** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_archive_leapp_log.yml:19 changed: [managed-node01] => {"backup": "", "changed": true, "msg": "line added"} TASK [infra.leapp.common : copy_archive_leapp_log | Slurp file /var/log/leapp/ansible_leapp_analysis.log] *** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_archive_leapp_log.yml:27 ok: [managed-node01] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false} TASK [infra.leapp.common : copy_archive_leapp_log | Decode file /var/log/leapp/ansible_leapp_analysis.log] *** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_archive_leapp_log.yml:33 ok: [managed-node01] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false} TASK [infra.leapp.common : copy_archive_leapp_log | Ensure reports directory on controller] *** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_archive_leapp_log.yml:38 ok: [managed-node01 -> localhost] => {"changed": false, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/tmp/workdir_controller8yrym8we/ansible_leapp_analysis_logs_2026-03-19_14-41-41", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 6, "state": "directory", "uid": 0} TASK [infra.leapp.common : copy_archive_leapp_log | Copy ansible leapp log to the controller] *** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_archive_leapp_log.yml:48 changed: [managed-node01] => {"changed": true, "checksum": "4c0896c4da4298366dcee90b2fb7c926c856139b", "dest": 
"/tmp/workdir_controller8yrym8we/ansible_leapp_analysis_logs_2026-03-19_14-41-41/managed-node01/ansible_leapp_analysis.log", "md5sum": "f0a82d228dde96b8cf5e3202e53f609d", "remote_checksum": "4c0896c4da4298366dcee90b2fb7c926c856139b", "remote_md5sum": null} TASK [infra.leapp.common : copy_archive_leapp_log | Copy log file to timestamped location] *** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_archive_leapp_log.yml:54 changed: [managed-node01] => {"changed": true, "checksum": "4c0896c4da4298366dcee90b2fb7c926c856139b", "dest": "/var/log/leapp/ansible_leapp_analysis_2026-03-19_14-41-41.log", "gid": 0, "group": "root", "md5sum": "f0a82d228dde96b8cf5e3202e53f609d", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:var_log_t:s0", "size": 318, "src": "/var/log/leapp/ansible_leapp_analysis.log", "state": "file", "uid": 0} TASK [infra.leapp.common : copy_archive_leapp_log | Remove original log file] *** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_archive_leapp_log.yml:66 changed: [managed-node01] => {"changed": true, "path": "/var/log/leapp/ansible_leapp_analysis.log", "state": "absent"} PLAY RECAP ********************************************************************* managed-node01 : ok=38 changed=17 unreachable=0 failed=1 skipped=19 rescued=0 ignored=0 Mar 19 10:41:32 managed-node01 python3[11167]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Mar 19 10:41:33 managed-node01 python3[11345]: ansible-ansible.legacy.dnf Invoked with name=['leapp-upgrade'] state=absent allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False 
validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Mar 19 10:41:35 managed-node01 python3[11495]: ansible-ansible.legacy.dnf Invoked with name=['python3-dnf-plugin-versionlock'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Mar 19 10:41:36 managed-node01 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
░░ Subject: A start job for unit run-rec5a695445154bf096833be662cef37b.service has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit run-rec5a695445154bf096833be662cef37b.service has finished successfully.
░░
░░ The job identifier is 795.
Mar 19 10:41:36 managed-node01 systemd[1]: Starting man-db-cache-update.service...
░░ Subject: A start job for unit man-db-cache-update.service has begun execution
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit man-db-cache-update.service has begun execution.
░░
░░ The job identifier is 874.
Mar 19 10:41:36 managed-node01 systemd[1]: man-db-cache-update.service: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit man-db-cache-update.service has successfully entered the 'dead' state.
Mar 19 10:41:36 managed-node01 systemd[1]: Finished man-db-cache-update.service.
░░ Subject: A start job for unit man-db-cache-update.service has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit man-db-cache-update.service has finished successfully.
░░
░░ The job identifier is 874.
Mar 19 10:41:36 managed-node01 systemd[1]: run-rec5a695445154bf096833be662cef37b.service: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit run-rec5a695445154bf096833be662cef37b.service has successfully entered the 'dead' state.
Mar 19 10:41:37 managed-node01 python3[11726]: ansible-ansible.legacy.command Invoked with _raw_params=dnf versionlock add dracut _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Mar 19 10:41:41 managed-node01 python3[11876]: ansible-ansible.builtin.file Invoked with path=/var/log/leapp state=directory owner=root group=root mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Mar 19 10:41:42 managed-node01 python3[12025]: ansible-ansible.builtin.stat Invoked with path=/var/log/leapp/ansible_leapp_analysis.log follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Mar 19 10:41:42 managed-node01 python3[12174]: ansible-ansible.legacy.stat Invoked with path=/var/log/leapp/ansible_leapp_analysis.log follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Mar 19 10:41:43 managed-node01 python3[12294]: ansible-ansible.legacy.copy Invoked with dest=/var/log/leapp/ansible_leapp_analysis.log owner=root group=root mode=0644
src=/root/.ansible/tmp/ansible-tmp-1773931302.3199677-9744-172705783645666/source _original_basename=tmpz_s3oo2u follow=False checksum=291d60f120986a014e6b93d58604fa7367bb7c2b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Mar 19 10:41:43 managed-node01 python3[12443]: ansible-ansible.builtin.file Invoked with path=/etc/ansible/facts.d state=directory mode=0755 owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Mar 19 10:41:44 managed-node01 python3[12592]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/pre_ipu.fact follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Mar 19 10:41:44 managed-node01 python3[12714]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/pre_ipu.fact mode=0644 owner=root group=root src=/root/.ansible/tmp/ansible-tmp-1773931303.7620137-9852-17060679618345/source _original_basename=tmpu4c02l2r follow=False checksum=1cb12ee832733f8bc6d91d254b9aa733d6617829 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Mar 19 10:41:44 managed-node01 python3[12863]: ansible-ansible.legacy.stat Invoked with path=/var/log/leapp/ansible_leapp_analysis.log follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Mar 19 10:41:45 managed-node01 python3[12985]: ansible-ansible.legacy.copy Invoked with dest=/var/log/leapp/ansible_leapp_analysis.log mode=0644 owner=root
group=root src=/root/.ansible/tmp/ansible-tmp-1773931304.6849356-9852-11731134780159/source _original_basename=tmpfp9miqix follow=False checksum=1cb12ee832733f8bc6d91d254b9aa733d6617829 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Mar 19 10:41:45 managed-node01 python3[13134]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; export PATH=$PATH; rpm -qa | grep -ve '[\.|+]el9' | grep -vE '^(gpg-pubkey|libmodulemd|katello-ca-consumer)' | sort _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Mar 19 10:41:46 managed-node01 python3[13288]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/non_rhel_packages.fact follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Mar 19 10:41:46 managed-node01 python3[13363]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/etc/ansible/facts.d/non_rhel_packages.fact _original_basename=tmpfjufi74h recurse=False state=file path=/etc/ansible/facts.d/non_rhel_packages.fact force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Mar 19 10:41:47 managed-node01 python3[13512]: ansible-ansible.legacy.stat Invoked with path=/var/log/leapp/ansible_leapp_analysis.log follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Mar 19 10:41:47 managed-node01 python3[13634]: ansible-ansible.legacy.copy Invoked with dest=/var/log/leapp/ansible_leapp_analysis.log mode=0644 owner=root group=root
src=/root/.ansible/tmp/ansible-tmp-1773931307.0333068-10143-2422295405688/source _original_basename=tmpaz0vzqze follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Mar 19 10:41:49 managed-node01 python3[13784]: ansible-ansible.builtin.file Invoked with path=/var/log/leapp state=directory owner=root group=root mode=0700 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Mar 19 10:41:49 managed-node01 python3[13933]: ansible-ansible.legacy.stat Invoked with path=/var/log/leapp/answerfile follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Mar 19 10:41:49 managed-node01 python3[14055]: ansible-ansible.legacy.copy Invoked with dest=/var/log/leapp/answerfile owner=root group=root mode=0644 src=/root/.ansible/tmp/ansible-tmp-1773931309.27762-10471-227346497135950/source _original_basename=tmpwdm_6m1t follow=False checksum=3d934ad808576e3a7fb4c14a89645a4ad55ccf53 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Mar 19 10:41:50 managed-node01 ansible-async_wrapper.py[14227]: Invoked with j632050642122 7200 /root/.ansible/tmp/ansible-tmp-1773931310.157377-10513-231686187714169/AnsiballZ_command.py _
Mar 19 10:41:50 managed-node01 ansible-async_wrapper.py[14230]: Starting module and watcher
Mar 19 10:41:50 managed-node01 ansible-async_wrapper.py[14230]: Start watching 14231 (7200)
Mar 19 10:41:50 managed-node01
ansible-async_wrapper.py[14231]: Start module (14231)
Mar 19 10:41:50 managed-node01 ansible-async_wrapper.py[14227]: Return async_wrapper task started.
Mar 19 10:41:51 managed-node01 python3[14232]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=set -o pipefail; export PATH=$PATH; ulimit -n 16384; leapp preupgrade --report-schema=1.2.0 --no-rhsm 2>&1 | tee -a /var/log/leapp/ansible_leapp_analysis.log _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Mar 19 10:41:51 managed-node01 ansible-async_wrapper.py[14231]: Module complete (14231)
Mar 19 10:41:55 managed-node01 ansible-async_wrapper.py[14230]: Done in kid B.
Mar 19 10:42:50 managed-node01 sshd[7900]: Received disconnect from 10.31.15.84 port 51796:11: disconnected by user
Mar 19 10:42:50 managed-node01 sshd[7900]: Disconnected from user root 10.31.15.84 port 51796
Mar 19 10:42:50 managed-node01 sshd[7897]: pam_unix(sshd:session): session closed for user root
Mar 19 10:42:50 managed-node01 systemd-logind[613]: Session 3 logged out. Waiting for processes to exit.
Mar 19 10:42:50 managed-node01 systemd[1]: session-3.scope: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit session-3.scope has successfully entered the 'dead' state.
Mar 19 10:42:50 managed-node01 systemd[1]: session-3.scope: Consumed 15.351s CPU time.
░░ Subject: Resources consumed by unit runtime
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit session-3.scope completed and consumed the indicated resources.
Mar 19 10:42:50 managed-node01 systemd-logind[613]: Removed session 3.
░░ Subject: Session 3 has been terminated
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A session with the ID 3 has been terminated.
Mar 19 10:42:51 managed-node01 sshd[14237]: Accepted publickey for root from 10.31.15.84 port 54032 ssh2: ECDSA SHA256:5dKg62FZTxyDk+oDA3dCp86Ela2X33u4kD8Rv9RzRYE
Mar 19 10:42:51 managed-node01 systemd-logind[613]: New session 5 of user root.
░░ Subject: A new session 5 has been created for user root
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A new session with the ID 5 has been created for the user root.
░░
░░ The leading process of the session is 14237.
Mar 19 10:42:51 managed-node01 systemd[1]: Started Session 5 of User root.
░░ Subject: A start job for unit session-5.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit session-5.scope has finished successfully.
░░
░░ The job identifier is 953.
Mar 19 10:42:51 managed-node01 sshd[14237]: pam_unix(sshd:session): session opened for user root(uid=0) by root(uid=0)
Mar 19 10:42:51 managed-node01 python3[14388]: ansible-ansible.legacy.async_status Invoked with jid=j632050642122.14227 mode=status _async_dir=/root/.ansible_async
Mar 19 10:42:51 managed-node01 python3[14484]: ansible-ansible.legacy.async_status Invoked with jid=j632050642122.14227 mode=cleanup _async_dir=/root/.ansible_async
Mar 19 10:42:52 managed-node01 python3[14633]: ansible-ansible.builtin.stat Invoked with path=/var/log/leapp/leapp-report.txt follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Mar 19 10:42:53 managed-node01 python3[14782]: ansible-ansible.builtin.stat Invoked with path=/var/log/leapp/leapp-report.json follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Mar 19 10:42:53 managed-node01 python3[14931]: ansible-ansible.builtin.stat Invoked with path=/var/log/leapp/leapp-preupgrade.log follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Mar 19 10:42:53
managed-node01 python3[15080]: ansible-ansible.builtin.stat Invoked with path=/var/log/leapp/ansible_leapp_analysis.log follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Mar 19 10:42:54 managed-node01 python3[15231]: ansible-ansible.builtin.lineinfile Invoked with path=/var/log/leapp/ansible_leapp_analysis.log line=Job ended at 2026-03-19T14:42:53Z owner=root group=root mode=0644 state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None seuser=None serole=None selevel=None setype=None attributes=None
Mar 19 10:42:55 managed-node01 python3[15529]: ansible-ansible.legacy.stat Invoked with path=/var/log/leapp/ansible_leapp_analysis.log follow=True get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Mar 19 10:42:55 managed-node01 python3[15703]: ansible-ansible.legacy.copy Invoked with src=/var/log/leapp/ansible_leapp_analysis.log dest=/var/log/leapp/ansible_leapp_analysis_2026-03-19_14-41-41.log remote_src=True mode=preserve backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Mar 19 10:42:56 managed-node01 python3[15852]: ansible-ansible.builtin.file Invoked with path=/var/log/leapp/ansible_leapp_analysis.log state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Mar 19 10:42:56 managed-node01 sshd[15877]: Accepted publickey for root from 10.31.15.84 port 54036 ssh2: RSA
SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Mar 19 10:42:56 managed-node01 systemd-logind[613]: New session 6 of user root.
░░ Subject: A new session 6 has been created for user root
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A new session with the ID 6 has been created for the user root.
░░
░░ The leading process of the session is 15877.
Mar 19 10:42:56 managed-node01 systemd[1]: Started Session 6 of User root.
░░ Subject: A start job for unit session-6.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit session-6.scope has finished successfully.
░░
░░ The job identifier is 1036.
Mar 19 10:42:56 managed-node01 sshd[15877]: pam_unix(sshd:session): session opened for user root(uid=0) by root(uid=0)
Mar 19 10:42:56 managed-node01 sshd[15880]: Received disconnect from 10.31.15.84 port 54036:11: disconnected by user
Mar 19 10:42:56 managed-node01 sshd[15880]: Disconnected from user root 10.31.15.84 port 54036
Mar 19 10:42:56 managed-node01 sshd[15877]: pam_unix(sshd:session): session closed for user root
Mar 19 10:42:56 managed-node01 systemd-logind[613]: Session 6 logged out. Waiting for processes to exit.
Mar 19 10:42:56 managed-node01 systemd[1]: session-6.scope: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit session-6.scope has successfully entered the 'dead' state.
Mar 19 10:42:56 managed-node01 systemd-logind[613]: Removed session 6.
░░ Subject: Session 6 has been terminated
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A session with the ID 6 has been terminated.
Mar 19 10:42:56 managed-node01 sshd[15905]: Accepted publickey for root from 10.31.15.84 port 54042 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Mar 19 10:42:56 managed-node01 systemd-logind[613]: New session 7 of user root.
░░ Subject: A new session 7 has been created for user root
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A new session with the ID 7 has been created for the user root.
░░
░░ The leading process of the session is 15905.
Mar 19 10:42:56 managed-node01 systemd[1]: Started Session 7 of User root.
░░ Subject: A start job for unit session-7.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit session-7.scope has finished successfully.
░░
░░ The job identifier is 1119.
Mar 19 10:42:56 managed-node01 sshd[15905]: pam_unix(sshd:session): session opened for user root(uid=0) by root(uid=0)