[WARNING]: Collection infra.leapp does not support Ansible version 2.14.18
[WARNING]: running playbook inside collection infra.leapp
ansible-playbook [core 2.14.18]
  config file = /etc/ansible/ansible.cfg
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python3.9/site-packages/ansible
  ansible collection location = /root/.ansible/collections:/usr/share/ansible/collections
  executable location = /usr/bin/ansible-playbook
  python version = 3.9.25 (main, Mar 9 2026, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-14)] (/usr/bin/python3)
  jinja version = 3.1.2
  libyaml = True
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_hostvars.yml ***************************************************
1 plays in /root/.ansible/collections/ansible_collections/infra/leapp/tests/tests_hostvars.yml

PLAY [Test how analysis role generates hostvars file when it exists] ***********

TASK [Gathering Facts] *********************************************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tests_hostvars.yml:2
ok: [managed-node01]

TASK [Сommon_upgrade_tasks | Remove leapp packages] ****************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tests_hostvars.yml:12
changed: [managed-node01] => {"changed": true, "changes": {"removed": ["leapp-upgrade"]}, "msg": "", "rc": 0, "results": ["Loaded plugins: product-id, search-disabled-repos, subscription-manager\n\nThis system is not registered with an entitlement server. 
You can use subscription-manager to register.\n\nResolving Dependencies\n--> Running transaction check\n---> Package leapp-upgrade-el7toel8.noarch 0:0.20.0-9.el7_9 will be erased\n--> Processing Dependency: leapp-repository for package: leapp-0.17.0-2.el7_9.noarch\n--> Running transaction check\n---> Package leapp.noarch 0:0.17.0-2.el7_9 will be erased\n--> Finished Dependency Resolution\n\nDependencies Resolved\n\n================================================================================\n Package Arch Version Repository Size\n================================================================================\nRemoving:\n leapp-upgrade-el7toel8 noarch 0.20.0-9.el7_9 @rhel-7-server-extras-rpms 9.7 M\nRemoving for dependencies:\n leapp noarch 0.17.0-2.el7_9 @rhel-7-server-extras-rpms 62 k\n\nTransaction Summary\n================================================================================\nRemove 1 Package (+1 Dependent package)\n\nInstalled size: 9.7 M\nDownloading packages:\nRunning transaction check\nRunning transaction test\nTransaction test succeeded\nRunning transaction\n Erasing : leapp-upgrade-el7toel8-0.20.0-9.el7_9.noarch 1/2 \n Erasing : leapp-0.17.0-2.el7_9.noarch 2/2 \n Verifying : leapp-0.17.0-2.el7_9.noarch 1/2 \n Verifying : leapp-upgrade-el7toel8-0.20.0-9.el7_9.noarch 2/2 \n\nRemoved:\n leapp-upgrade-el7toel8.noarch 0:0.20.0-9.el7_9 \n\nDependency Removed:\n leapp.noarch 0:0.17.0-2.el7_9 \n\nComplete!\n"]} TASK [Сommon_upgrade_tasks | Gather setup tasks] ******************************* task path: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tests_hostvars.yml:17 ok: [managed-node01 -> localhost] => {"changed": false, "examined": 4, "files": [{"atime": 1773834327.5701265, "ctime": 1773834327.3941276, "dev": 51716, "gid": 0, "gr_name": "root", "inode": 746586270, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mode": "0644", 
"mtime": 1773834327.3941276, "nlink": 1, "path": "/root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/setup/remediate_cifs.yml", "pw_name": "root", "rgrp": true, "roth": true, "rusr": true, "size": 272, "uid": 0, "wgrp": false, "woth": false, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}, {"atime": 1773834327.5701265, "ctime": 1773834327.3941276, "dev": 51716, "gid": 0, "gr_name": "root", "inode": 746586271, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mode": "0644", "mtime": 1773834327.3941276, "nlink": 1, "path": "/root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/setup/remediate_remote_using_root.yml", "pw_name": "root", "rgrp": true, "roth": true, "rusr": true, "size": 268, "uid": 0, "wgrp": false, "woth": false, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}, {"atime": 1773834327.5701265, "ctime": 1773834327.3941276, "dev": 51716, "gid": 0, "gr_name": "root", "inode": 746586272, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mode": "0644", "mtime": 1773834327.3941276, "nlink": 1, "path": "/root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/setup/remediate_removed_kernel_drivers.yml", "pw_name": "root", "rgrp": true, "roth": true, "rusr": true, "size": 913, "uid": 0, "wgrp": false, "woth": false, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}, {"atime": 1773834327.5701265, "ctime": 1773834327.3941276, "dev": 51716, "gid": 0, "gr_name": "root", "inode": 746586273, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mode": "0644", "mtime": 1773834327.3941276, "nlink": 1, "path": "/root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/setup/version_lock.yml", 
"pw_name": "root", "rgrp": true, "roth": true, "rusr": true, "size": 548, "uid": 0, "wgrp": false, "woth": false, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}], "matched": 4, "msg": "All paths examined", "skipped_paths": {}} TASK [Сommon_upgrade_tasks | Do remediation setup tasks] *********************** task path: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tests_hostvars.yml:26 skipping: [managed-node01] => (item=/root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/setup/remediate_cifs.yml) => {"ansible_loop_var": "setup_task_file", "changed": false, "setup_task_file": "/root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/setup/remediate_cifs.yml", "skip_reason": "Conditional result was False"} skipping: [managed-node01] => (item=/root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/setup/remediate_remote_using_root.yml) => {"ansible_loop_var": "setup_task_file", "changed": false, "setup_task_file": "/root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/setup/remediate_remote_using_root.yml", "skip_reason": "Conditional result was False"} skipping: [managed-node01] => (item=/root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/setup/remediate_removed_kernel_drivers.yml) => {"ansible_loop_var": "setup_task_file", "changed": false, "setup_task_file": "/root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/setup/remediate_removed_kernel_drivers.yml", "skip_reason": "Conditional result was False"} skipping: [managed-node01] => {"changed": false, "msg": "All items skipped"} TASK [Сommon_upgrade_tasks | Do setup tasks] *********************************** task path: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tests_hostvars.yml:37 included: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/setup/version_lock.yml for managed-node01 => 
(item=/root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/setup/version_lock.yml) TASK [setup | version_lock | Install versionlock module] *********************** task path: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/setup/version_lock.yml:2 changed: [managed-node01] => {"changed": true, "changes": {"installed": ["yum-plugin-versionlock"]}, "msg": "", "rc": 0, "results": ["Loaded plugins: product-id, search-disabled-repos, subscription-manager\n\nThis system is not registered with an entitlement server. You can use subscription-manager to register.\n\nResolving Dependencies\n--> Running transaction check\n---> Package yum-plugin-versionlock.noarch 0:1.1.31-54.el7_8 will be installed\n--> Finished Dependency Resolution\n\nDependencies Resolved\n\n================================================================================\n Package Arch Version Repository\n Size\n================================================================================\nInstalling:\n yum-plugin-versionlock noarch 1.1.31-54.el7_8 rhel 36 k\n\nTransaction Summary\n================================================================================\nInstall 1 Package\n\nTotal download size: 36 k\nInstalled size: 53 k\nDownloading packages:\nRunning transaction check\nRunning transaction test\nTransaction test succeeded\nRunning transaction\n Installing : yum-plugin-versionlock-1.1.31-54.el7_8.noarch 1/1 \n Verifying : yum-plugin-versionlock-1.1.31-54.el7_8.noarch 1/1 \n\nInstalled:\n yum-plugin-versionlock.noarch 0:1.1.31-54.el7_8 \n\nComplete!\n"]} TASK [setup | version_lock | Version lock the dracut package] ****************** task path: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tasks/setup/version_lock.yml:7 changed: [managed-node01] => {"changed": true, "cmd": ["yum", "versionlock", "add", "dracut"], "delta": "0:00:00.263914", "end": "2026-03-18 07:52:14.191198", "msg": "", "rc": 0, "start": "2026-03-18 07:52:13.927284", 
"stderr": "", "stderr_lines": [], "stdout": "Loaded plugins: product-id, search-disabled-repos, subscription-manager,\n : versionlock\n\nThis system is not registered with an entitlement server. You can use subscription-manager to register.\n\nAdding versionlock on: 0:dracut-033-572.el7\nversionlock added: 1", "stdout_lines": ["Loaded plugins: product-id, search-disabled-repos, subscription-manager,", " : versionlock", "", "This system is not registered with an entitlement server. You can use subscription-manager to register.", "", "Adding versionlock on: 0:dracut-033-572.el7", "versionlock added: 1"]} TASK [Test | Create tempdir directory for workdir controller] ****************** task path: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tests_hostvars.yml:53 changed: [managed-node01 -> localhost] => {"changed": true, "gid": 0, "group": "root", "mode": "0700", "owner": "root", "path": "/tmp/workdir_controllervqwwyk3s", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 6, "state": "directory", "uid": 0} TASK [Test | Prepare hostvars directory] *************************************** task path: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tests_hostvars.yml:59 changed: [managed-node01 -> localhost] => {"changed": true, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/tmp/workdir_controllervqwwyk3s/host_vars", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 6, "state": "directory", "uid": 0} TASK [Test | Set fact with hostvars file] ************************************** task path: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tests_hostvars.yml:65 ok: [managed-node01 -> localhost] => {"ansible_facts": {"leapp_hostvars_file": "/tmp/workdir_controllervqwwyk3s/host_vars/managed-node01.yml"}, "changed": false} TASK [Test | Write a sample hostvars file] ************************************* task path: 
/root/.ansible/collections/ansible_collections/infra/leapp/tests/tests_hostvars.yml:69 changed: [managed-node01 -> localhost] => {"changed": true, "checksum": "165076eb6b9fac25e39014723e3cdb9b570bce81", "dest": "/tmp/workdir_controllervqwwyk3s/host_vars/managed-node01.yml", "gid": 0, "group": "root", "md5sum": "b94177b40d078db8db5b3fcf744d19b1", "mode": "0600", "owner": "root", "secontext": "unconfined_u:object_r:admin_home_t:s0", "size": 221, "src": "/root/.ansible/tmp/ansible-tmp-1773834735.645473-9319-206772201848487/source", "state": "file", "uid": 0} TASK [Test | Print hostvars file content] ************************************** task path: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tests_hostvars.yml:87 ok: [managed-node01 -> localhost] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false} TASK [Test | Run role analysis] ************************************************ task path: /root/.ansible/collections/ansible_collections/infra/leapp/tests/tests_hostvars.yml:93 TASK [infra.leapp.analysis : Lock timestamped variables] *********************** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/main.yml:5 ok: [managed-node01] => {"ansible_facts": {"__leapp_timestamp": "2026-03-18_11-52-17"}, "changed": false} TASK [Initialize lock, logging, and common vars] ******************************* task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/main.yml:9 TASK [infra.leapp.common : init_leapp_log | Ensure that log directory exists] *** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/init_leapp_log.yml:9 ok: [managed-node01] => {"changed": false, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/var/log/leapp", "secontext": "system_u:object_r:var_log_t:s0", "size": 69, "state": "directory", "uid": 0} TASK [infra.leapp.common : 
init_leapp_log | Check for existing log file] ******* task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/init_leapp_log.yml:17 ok: [managed-node01] => {"changed": false, "stat": {"exists": false}} TASK [infra.leapp.common : init_leapp_log | Fail if log file already exists] *** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/init_leapp_log.yml:22 skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"} TASK [infra.leapp.common : init_leapp_log | Create new log file] *************** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/init_leapp_log.yml:31 changed: [managed-node01] => {"changed": true, "checksum": "188a9b8944b35fb247328a17aac1ef25b2fc04f3", "dest": "/var/log/leapp/ansible_leapp_analysis.log", "gid": 0, "group": "root", "md5sum": "17c6da43d24e3c0dc186e086ab2b588a", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:var_log_t:s0", "size": 70, "src": "/root/.ansible/tmp/ansible-tmp-1773834738.0576887-9529-5342301259653/source", "state": "file", "uid": 0} TASK [infra.leapp.common : init_leapp_log | /etc/ansible/facts.d directory exists] *** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/init_leapp_log.yml:41 ok: [managed-node01] => {"changed": false, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/etc/ansible/facts.d", "secontext": "unconfined_u:object_r:etc_t:s0", "size": 56, "state": "directory", "uid": 0} TASK [infra.leapp.common : init_leapp_log | Capture current ansible_facts for validation after upgrade] *** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/init_leapp_log.yml:49 changed: [managed-node01] => (item=/etc/ansible/facts.d/pre_ipu.fact) => {"ansible_loop_var": "item", "changed": true, "checksum": "b0d6a16e6c4212e59ad3179dfdeff540b6f2a7b3", "dest": 
"/etc/ansible/facts.d/pre_ipu.fact", "gid": 0, "group": "root", "item": "/etc/ansible/facts.d/pre_ipu.fact", "md5sum": "39231923ebf0fe3049fb091d5f6aac1b", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:etc_t:s0", "size": 11997, "src": "/root/.ansible/tmp/ansible-tmp-1773834738.9640179-9685-152558004720111/source", "state": "file", "uid": 0} changed: [managed-node01] => (item=/var/log/leapp/ansible_leapp_analysis.log) => {"ansible_loop_var": "item", "changed": true, "checksum": "b0d6a16e6c4212e59ad3179dfdeff540b6f2a7b3", "dest": "/var/log/leapp/ansible_leapp_analysis.log", "gid": 0, "group": "root", "item": "/var/log/leapp/ansible_leapp_analysis.log", "md5sum": "39231923ebf0fe3049fb091d5f6aac1b", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:var_log_t:s0", "size": 11997, "src": "/root/.ansible/tmp/ansible-tmp-1773834739.6396976-9685-137812043868881/source", "state": "file", "uid": 0} TASK [infra.leapp.common : init_leapp_log | Capture a list of non-rhel versioned packages] *** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/init_leapp_log.yml:60 ok: [managed-node01] => {"changed": false, "cmd": "set -o pipefail; export PATH=$PATH; rpm -qa | grep -ve '[\\.|+]el7' | grep -vE '^(gpg-pubkey|libmodulemd|katello-ca-consumer)' | sort", "delta": "0:00:00.347001", "end": "2026-03-18 07:52:20.844604", "failed_when_result": false, "msg": "", "rc": 0, "start": "2026-03-18 07:52:20.497603", "stderr": "", "stderr_lines": [], "stdout": "epel-release-7-14.noarch\ntps-devel-2.44.50-1.noarch", "stdout_lines": ["epel-release-7-14.noarch", "tps-devel-2.44.50-1.noarch"]} TASK [infra.leapp.common : init_leapp_log | Create fact with the non-rhel versioned packages list] *** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/init_leapp_log.yml:74 ok: [managed-node01] => {"ansible_facts": {"non_rhel_packages": ["epel-release-7-14.noarch", "tps-devel-2.44.50-1.noarch"]}, 
"changed": false} TASK [infra.leapp.common : init_leapp_log | Capture the list of non-rhel versioned packages in a separate fact file] *** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/init_leapp_log.yml:78 ok: [managed-node01] => (item=/etc/ansible/facts.d/non_rhel_packages.fact) => {"ansible_loop_var": "item", "changed": false, "checksum": "6d36b22d9c2b2f366fc090edfbac427c77d524a5", "dest": "/etc/ansible/facts.d/non_rhel_packages.fact", "gid": 0, "group": "root", "item": "/etc/ansible/facts.d/non_rhel_packages.fact", "mode": "0644", "owner": "root", "path": "/etc/ansible/facts.d/non_rhel_packages.fact", "secontext": "system_u:object_r:etc_t:s0", "size": 58, "state": "file", "uid": 0} changed: [managed-node01] => (item=/var/log/leapp/ansible_leapp_analysis.log) => {"ansible_loop_var": "item", "changed": true, "checksum": "6d36b22d9c2b2f366fc090edfbac427c77d524a5", "dest": "/var/log/leapp/ansible_leapp_analysis.log", "gid": 0, "group": "root", "item": "/var/log/leapp/ansible_leapp_analysis.log", "md5sum": "a7d4e8abcc28ebc36ca5401fee060144", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:var_log_t:s0", "size": 58, "src": "/root/.ansible/tmp/ansible-tmp-1773834741.570873-9868-243867616517692/source", "state": "file", "uid": 0} TASK [infra.leapp.analysis : Include tasks for preupg assistant analysis] ****** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/main.yml:19 skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"} TASK [infra.leapp.analysis : Include tasks for leapp preupgrade analysis] ****** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/main.yml:23 included: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml for managed-node01 TASK [analysis-leapp | Include pre_upgrade.yml] ******************************** task path: 
/root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml:4 TASK [infra.leapp.common : pre_upgrade | Register with Satellite activation key] *** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/pre_upgrade.yml:3 skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"} TASK [pre_upgrade | Include custom_local_repos for local_repos_pre_leapp] ****** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/pre_upgrade.yml:10 TASK [infra.leapp.common : custom_local_repos | Remove old /etc/leapp/files/leapp_upgrade_repositories.repo] *** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/custom_local_repos.yml:2 skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"} TASK [infra.leapp.common : custom_local_repos | Validate repo definitions have baseurl or metalink] *** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/custom_local_repos.yml:9 skipping: [managed-node01] => {"changed": false, "skipped_reason": "No items in the list"} TASK [infra.leapp.common : custom_local_repos | Enable custom upgrade yum repositories] *** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/custom_local_repos.yml:16 skipping: [managed-node01] => {"changed": false, "skipped_reason": "No items in the list"} TASK [infra.leapp.common : pre_upgrade | Get package version lock entries] ***** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/pre_upgrade.yml:21 ok: [managed-node01] => {"changed": false, "cmd": ["yum", "versionlock", "list"], "delta": "0:00:00.263073", "end": "2026-03-18 07:52:23.003283", "failed_when_result": false, "msg": "", "rc": 0, "start": "2026-03-18 07:52:22.740210", "stderr": "", "stderr_lines": [], "stdout": "Loaded plugins: 
product-id, search-disabled-repos, subscription-manager,\n : versionlock\n\nThis system is not registered with an entitlement server. You can use subscription-manager to register.\n\n0:dracut-033-572.el7.*\nversionlock list done", "stdout_lines": ["Loaded plugins: product-id, search-disabled-repos, subscription-manager,", " : versionlock", "", "This system is not registered with an entitlement server. You can use subscription-manager to register.", "", "0:dracut-033-572.el7.*", "versionlock list done"]} TASK [infra.leapp.common : pre_upgrade | Remove all package version locks] ***** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/pre_upgrade.yml:28 changed: [managed-node01] => {"changed": true, "cmd": ["yum", "versionlock", "clear"], "delta": "0:00:00.259220", "end": "2026-03-18 07:52:23.563085", "msg": "", "rc": 0, "start": "2026-03-18 07:52:23.303865", "stderr": "", "stderr_lines": [], "stdout": "Loaded plugins: product-id, search-disabled-repos, subscription-manager,\n : versionlock\n\nThis system is not registered with an entitlement server. You can use subscription-manager to register.\n\nversionlock cleared", "stdout_lines": ["Loaded plugins: product-id, search-disabled-repos, subscription-manager,", " : versionlock", "", "This system is not registered with an entitlement server. You can use subscription-manager to register.", "", "versionlock cleared"]} TASK [infra.leapp.common : pre_upgrade | Install packages for upgrade from RHEL 7] *** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/pre_upgrade.yml:36 changed: [managed-node01] => {"changed": true, "changes": {"installed": ["leapp-upgrade"], "updated": []}, "msg": "", "rc": 0, "results": ["Loaded plugins: product-id, search-disabled-repos, subscription-manager,\n : versionlock\n\nThis system is not registered with an entitlement server. 
You can use subscription-manager to register.\n\nResolving Dependencies\n--> Running transaction check\n---> Package leapp-upgrade-el7toel8.noarch 0:0.20.0-9.el7_9 will be installed\n--> Processing Dependency: leapp for package: leapp-upgrade-el7toel8-0.20.0-9.el7_9.noarch\n--> Running transaction check\n---> Package leapp.noarch 0:0.17.0-2.el7_9 will be installed\n--> Finished Dependency Resolution\n\nDependencies Resolved\n\n================================================================================\n Package Arch Version Repository Size\n================================================================================\nInstalling:\n leapp-upgrade-el7toel8 noarch 0.20.0-9.el7_9 rhel-7-server-extras-rpms 1.2 M\nInstalling for dependencies:\n leapp noarch 0.17.0-2.el7_9 rhel-7-server-extras-rpms 29 k\n\nTransaction Summary\n================================================================================\nInstall 1 Package (+1 Dependent package)\n\nTotal download size: 1.3 M\nInstalled size: 9.7 M\nDownloading packages:\n--------------------------------------------------------------------------------\nTotal 8.1 MB/s | 1.3 MB 00:00 \nRunning transaction check\nRunning transaction test\nTransaction test succeeded\nRunning transaction\n Installing : leapp-upgrade-el7toel8-0.20.0-9.el7_9.noarch 1/2 \n Installing : leapp-0.17.0-2.el7_9.noarch 2/2 \n Verifying : leapp-0.17.0-2.el7_9.noarch 1/2 \n Verifying : leapp-upgrade-el7toel8-0.20.0-9.el7_9.noarch 2/2 \n\nInstalled:\n leapp-upgrade-el7toel8.noarch 0:0.20.0-9.el7_9 \n\nDependency Installed:\n leapp.noarch 0:0.17.0-2.el7_9 \n\nComplete!\n"]} TASK [infra.leapp.common : pre_upgrade | Include update-and-reboot.yml] ******** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/pre_upgrade.yml:45 fatal: [managed-node01]: FAILED! => {"msg": "The conditional check 'leapp_pre_upgrade_update | bool' failed. 
The error was: error while evaluating conditional (leapp_pre_upgrade_update | bool): 'leapp_pre_upgrade_update' is undefined. 'leapp_pre_upgrade_update' is undefined\n\nThe error appears to be in '/root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/pre_upgrade.yml': line 45, column 7, but may\nbe elsewhere in the file depending on the exact syntax problem.\n\nThe offending line appears to be:\n\n\n - name: pre_upgrade | Include update-and-reboot.yml\n ^ here\n"}

TASK [analysis-leapp | Include custom_local_repos for local_repos_post_analysis] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml:70

TASK [infra.leapp.common : custom_local_repos | Remove old /etc/leapp/files/leapp_upgrade_repositories.repo] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/custom_local_repos.yml:2
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : custom_local_repos | Validate repo definitions have baseurl or metalink] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/custom_local_repos.yml:9
skipping: [managed-node01] => {"changed": false, "skipped_reason": "No items in the list"}

TASK [infra.leapp.common : custom_local_repos | Enable custom upgrade yum repositories] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/custom_local_repos.yml:16
skipping: [managed-node01] => {"changed": false, "skipped_reason": "No items in the list"}

TASK [analysis-leapp | Restore original Satellite activation key] **************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml:80
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [analysis-leapp | Copy reports to the controller] *************************
task 
path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml:91 TASK [infra.leapp.common : copy_reports_to_controller | Ensure reports directory on controller] *** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_reports_to_controller.yml:20 changed: [managed-node01 -> localhost] => {"changed": true, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/tmp/workdir_controllervqwwyk3s/ansible_leapp_analysis_logs_2026-03-18_11-52-17", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 6, "state": "directory", "uid": 0} TASK [infra.leapp.common : copy_reports_to_controller | Fetch report files if they exist] *** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_reports_to_controller.yml:30 included: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/fetch_file_if_exists.yml for managed-node01 => (item=/var/log/leapp/leapp-report.txt) included: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/fetch_file_if_exists.yml for managed-node01 => (item=/var/log/leapp/leapp-report.json) included: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/fetch_file_if_exists.yml for managed-node01 => (item=/var/log/leapp/leapp-preupgrade.log) TASK [infra.leapp.common : fetch_file_if_exists | Check if file exists] ******** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/fetch_file_if_exists.yml:7 ok: [managed-node01] => {"changed": false, "stat": {"exists": false}} TASK [infra.leapp.common : fetch_file_if_exists | Copy report file to the controller] *** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/fetch_file_if_exists.yml:12 skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"} TASK [infra.leapp.common : fetch_file_if_exists 
| Check if file exists] ********
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/fetch_file_if_exists.yml:7
ok: [managed-node01] => {"changed": false, "stat": {"exists": false}}

TASK [infra.leapp.common : fetch_file_if_exists | Copy report file to the controller] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/fetch_file_if_exists.yml:12
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : fetch_file_if_exists | Check if file exists] ********
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/fetch_file_if_exists.yml:7
ok: [managed-node01] => {"changed": false, "stat": {"exists": false}}

TASK [infra.leapp.common : fetch_file_if_exists | Copy report file to the controller] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/fetch_file_if_exists.yml:12
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [copy_reports_to_controller | Copy log file to the controller] ************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_reports_to_controller.yml:39

TASK [infra.leapp.common : copy_archive_leapp_log | Check for log file] ********
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_archive_leapp_log.yml:11
ok: [managed-node01] => {"changed": false, "stat": {"atime": 1773834742.0127416, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "6d36b22d9c2b2f366fc090edfbac427c77d524a5", "ctime": 1773834742.0137415, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 117440610, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1773834741.8207405, "nlink": 1, "path": "/var/log/leapp/ansible_leapp_analysis.log", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 58, "uid": 0, "version": "480616321", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}}

TASK [infra.leapp.common : copy_archive_leapp_log | Add end time to log file] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_archive_leapp_log.yml:19
changed: [managed-node01] => {"backup": "", "changed": true, "msg": "line added"}

TASK [infra.leapp.common : copy_archive_leapp_log | Slurp file /var/log/leapp/ansible_leapp_analysis.log] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_archive_leapp_log.yml:27
ok: [managed-node01] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}

TASK [infra.leapp.common : copy_archive_leapp_log | Decode file /var/log/leapp/ansible_leapp_analysis.log] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_archive_leapp_log.yml:33
ok: [managed-node01] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}

TASK [infra.leapp.common : copy_archive_leapp_log | Ensure reports directory on controller] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_archive_leapp_log.yml:38
ok: [managed-node01 -> localhost] => {"changed": false, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/tmp/workdir_controllervqwwyk3s/ansible_leapp_analysis_logs_2026-03-18_11-52-17", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 6, "state": "directory", "uid": 0}

TASK [infra.leapp.common : copy_archive_leapp_log | Copy ansible leapp log to the controller] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_archive_leapp_log.yml:48
changed: [managed-node01] => {"changed": true, "checksum": "f993af1b64c5ac289eed247d77887fe72ec5939c", "dest": "/tmp/workdir_controllervqwwyk3s/ansible_leapp_analysis_logs_2026-03-18_11-52-17/managed-node01/ansible_leapp_analysis.log", "md5sum": "e834634ac6e5ce38355a470da20389bb", "remote_checksum": "f993af1b64c5ac289eed247d77887fe72ec5939c", "remote_md5sum": null}

TASK [infra.leapp.common : copy_archive_leapp_log | Copy log file to timestamped location] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_archive_leapp_log.yml:54
changed: [managed-node01] => {"changed": true, "checksum": "f993af1b64c5ac289eed247d77887fe72ec5939c", "dest": "/var/log/leapp/ansible_leapp_analysis_2026-03-18_11-52-17.log", "gid": 0, "group": "root", "md5sum": "e834634ac6e5ce38355a470da20389bb", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:var_log_t:s0", "size": 93, "src": "/var/log/leapp/ansible_leapp_analysis.log", "state": "file", "uid": 0}

TASK [infra.leapp.common : copy_archive_leapp_log | Remove original log file] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/copy_archive_leapp_log.yml:66
changed: [managed-node01] => {"changed": true, "path": "/var/log/leapp/ansible_leapp_analysis.log", "state": "absent"}

PLAY RECAP *********************************************************************
managed-node01             : ok=39   changed=16   unreachable=0    failed=1    skipped=14   rescued=0    ignored=0

-- Logs begin at Wed 2026-03-18 07:48:58 EDT, end at Wed 2026-03-18 07:52:32 EDT.
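The repeated fetch_file_if_exists task pairs in the play output above (a stat check at line 7 followed by a conditionally skipped fetch at line 12) reflect a common stat-then-fetch Ansible pattern. A minimal sketch of that pattern, under the assumption that the role uses a variable like `file_to_fetch` and a controller destination like `dest_dir` (both names are illustrative, not the collection's actual ones):

```yaml
# Sketch of a stat-then-fetch pattern; variable names are illustrative
# assumptions, not copied from the infra.leapp role.
- name: fetch_file_if_exists | Check if file exists
  ansible.builtin.stat:
    path: "{{ file_to_fetch }}"
  register: file_stat

- name: fetch_file_if_exists | Copy report file to the controller
  ansible.builtin.fetch:
    src: "{{ file_to_fetch }}"
    dest: "{{ dest_dir }}/{{ inventory_hostname }}/"
  when: file_stat.stat.exists
```

When the file is absent, the fetch task is skipped with "Conditional result was False", which is exactly what the skipping: lines above show for the missing leapp report files.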
--
Mar 18 07:52:07 managed-node01 ansible-ansible.legacy.setup[13313]: Invoked with filter=[] gather_subset=['all'] fact_path=/etc/ansible/facts.d gather_timeout=10
Mar 18 07:52:08 managed-node01 ansible-ansible.legacy.yum[13404]: Invoked with lock_timeout=30 update_cache=False conf_file=None exclude=[] allow_downgrade=False sslverify=True disable_gpg_check=False disable_excludes=None use_backend=auto validate_certs=True state=absent disablerepo=[] skip_broken=False releasever=None cacheonly=False autoremove=False download_dir=None installroot=/ install_weak_deps=True name=['leapp-upgrade'] download_only=False bugfix=False list=None install_repoquery=True update_only=False disable_plugin=[] enablerepo=[] security=False enable_plugin=[]
Mar 18 07:52:10 managed-node01 yum[13415]: Erased: leapp-upgrade-el7toel8-0.20.0-9.el7_9.noarch
Mar 18 07:52:10 managed-node01 yum[13415]: Erased: leapp-0.17.0-2.el7_9.noarch
Mar 18 07:52:11 managed-node01 ansible-ansible.legacy.yum[13490]: Invoked with lock_timeout=30 update_cache=False conf_file=None exclude=[] allow_downgrade=False sslverify=True disable_gpg_check=False disable_excludes=None use_backend=auto validate_certs=True state=present disablerepo=[] skip_broken=False releasever=None cacheonly=False autoremove=False download_dir=None installroot=/ install_weak_deps=True name=['yum-plugin-versionlock'] download_only=False bugfix=False list=None install_repoquery=True update_only=False disable_plugin=[] enablerepo=[] security=False enable_plugin=[]
Mar 18 07:52:13 managed-node01 yum[13502]: Installed: yum-plugin-versionlock-1.1.31-54.el7_8.noarch
Mar 18 07:52:13 managed-node01 ansible-ansible.legacy.command[13577]: Invoked with executable=None _uses_shell=False strip_empty_ends=True _raw_params=yum versionlock add dracut removes=None argv=None creates=None chdir=None stdin_add_newline=True stdin=None
Mar 18 07:52:17 managed-node01 ansible-ansible.builtin.file[13652]: Invoked with src=None selevel=None force=False setype=None _original_basename=None unsafe_writes=False access_time=None seuser=None recurse=False state=directory access_time_format=%Y%m%d%H%M.%S group=root modification_time=None serole=None _diff_peek=None modification_time_format=%Y%m%d%H%M.%S path=/var/log/leapp owner=root follow=True attributes=None mode=0755
Mar 18 07:52:17 managed-node01 ansible-ansible.builtin.stat[13713]: Invoked with checksum_algorithm=sha1 get_checksum=True follow=False path=/var/log/leapp/ansible_leapp_analysis.log get_md5=False get_mime=True get_attributes=True
Mar 18 07:52:18 managed-node01 ansible-ansible.legacy.stat[13774]: Invoked with checksum_algorithm=sha1 get_checksum=True path=/var/log/leapp/ansible_leapp_analysis.log follow=False get_md5=False get_mime=True get_attributes=True
Mar 18 07:52:18 managed-node01 ansible-ansible.legacy.copy[13820]: Invoked with src=/root/.ansible/tmp/ansible-tmp-1773834738.0576887-9529-5342301259653/source directory_mode=None force=True attributes=None remote_src=None unsafe_writes=False dest=/var/log/leapp/ansible_leapp_analysis.log seuser=None setype=None group=root content=NOT_LOGGING_PARAMETER _original_basename=tmpjf5ol40t serole=None mode=0644 selevel=None owner=root follow=False validate=None checksum=188a9b8944b35fb247328a17aac1ef25b2fc04f3 backup=False local_follow=None
Mar 18 07:52:18 managed-node01 ansible-ansible.builtin.file[13882]: Invoked with src=None selevel=None force=False setype=None _original_basename=None unsafe_writes=False access_time=None seuser=None recurse=False state=directory access_time_format=%Y%m%d%H%M.%S group=root modification_time=None serole=None _diff_peek=None modification_time_format=%Y%m%d%H%M.%S path=/etc/ansible/facts.d owner=root follow=True attributes=None mode=0755
Mar 18 07:52:19 managed-node01 ansible-ansible.legacy.stat[13943]: Invoked with checksum_algorithm=sha1 get_checksum=True path=/etc/ansible/facts.d/pre_ipu.fact follow=False get_md5=False get_mime=True get_attributes=True
Mar 18 07:52:19 managed-node01 ansible-ansible.legacy.copy[13991]: Invoked with src=/root/.ansible/tmp/ansible-tmp-1773834738.9640179-9685-152558004720111/source directory_mode=None force=True attributes=None remote_src=None unsafe_writes=False dest=/etc/ansible/facts.d/pre_ipu.fact seuser=None setype=None group=root content=NOT_LOGGING_PARAMETER _original_basename=tmp1ozlpf72 serole=None mode=0644 selevel=None owner=root follow=False validate=None checksum=b0d6a16e6c4212e59ad3179dfdeff540b6f2a7b3 backup=False local_follow=None
Mar 18 07:52:19 managed-node01 ansible-ansible.legacy.stat[14052]: Invoked with checksum_algorithm=sha1 get_checksum=True path=/var/log/leapp/ansible_leapp_analysis.log follow=False get_md5=False get_mime=True get_attributes=True
Mar 18 07:52:20 managed-node01 ansible-ansible.legacy.copy[14101]: Invoked with src=/root/.ansible/tmp/ansible-tmp-1773834739.6396976-9685-137812043868881/source directory_mode=None force=True attributes=None remote_src=None unsafe_writes=False dest=/var/log/leapp/ansible_leapp_analysis.log seuser=None setype=None group=root content=NOT_LOGGING_PARAMETER _original_basename=tmpp7mfzdxw serole=None mode=0644 selevel=None owner=root follow=False validate=None checksum=b0d6a16e6c4212e59ad3179dfdeff540b6f2a7b3 backup=False local_follow=None
Mar 18 07:52:20 managed-node01 ansible-ansible.legacy.command[14162]: Invoked with executable=None _uses_shell=True strip_empty_ends=True _raw_params=set -o pipefail; export PATH=$PATH; rpm -qa | grep -ve '[\.|+]el7' | grep -vE '^(gpg-pubkey|libmodulemd|katello-ca-consumer)' | sort removes=None argv=None creates=None chdir=None stdin_add_newline=True stdin=None
Mar 18 07:52:21 managed-node01 ansible-ansible.legacy.stat[14229]: Invoked with checksum_algorithm=sha1 get_checksum=True path=/etc/ansible/facts.d/non_rhel_packages.fact follow=False get_md5=False get_mime=True get_attributes=True
Mar 18 07:52:21 managed-node01 ansible-ansible.legacy.file[14261]: Invoked with force=False _original_basename=tmpnh_bwxs7 owner=root follow=True group=root unsafe_writes=False serole=None state=file selevel=None setype=None dest=/etc/ansible/facts.d/non_rhel_packages.fact access_time=None access_time_format=%Y%m%d%H%M.%S modification_time=None path=/etc/ansible/facts.d/non_rhel_packages.fact src=None seuser=None recurse=False _diff_peek=None mode=0644 modification_time_format=%Y%m%d%H%M.%S attributes=None
Mar 18 07:52:21 managed-node01 ansible-ansible.legacy.stat[14322]: Invoked with checksum_algorithm=sha1 get_checksum=True path=/var/log/leapp/ansible_leapp_analysis.log follow=False get_md5=False get_mime=True get_attributes=True
Mar 18 07:52:22 managed-node01 ansible-ansible.legacy.copy[14370]: Invoked with src=/root/.ansible/tmp/ansible-tmp-1773834741.570873-9868-243867616517692/source directory_mode=None force=True attributes=None remote_src=None unsafe_writes=False dest=/var/log/leapp/ansible_leapp_analysis.log seuser=None setype=None group=root content=NOT_LOGGING_PARAMETER _original_basename=tmpvq674lpv serole=None mode=0644 selevel=None owner=root follow=False validate=None checksum=6d36b22d9c2b2f366fc090edfbac427c77d524a5 backup=False local_follow=None
Mar 18 07:52:22 managed-node01 ansible-ansible.legacy.command[14432]: Invoked with executable=None _uses_shell=False strip_empty_ends=True _raw_params=yum versionlock list removes=None argv=None creates=None chdir=None stdin_add_newline=True stdin=None
Mar 18 07:52:23 managed-node01 ansible-ansible.legacy.command[14504]: Invoked with executable=None _uses_shell=False strip_empty_ends=True _raw_params=yum versionlock clear removes=None argv=None creates=None chdir=None stdin_add_newline=True stdin=None
Mar 18 07:52:24 managed-node01 ansible-ansible.legacy.yum[14577]: Invoked with lock_timeout=30 update_cache=False conf_file=None exclude=[] allow_downgrade=False sslverify=True disable_gpg_check=False disable_excludes=None use_backend=auto validate_certs=True state=latest disablerepo=[] releasever=None skip_broken=False cacheonly=False autoremove=False download_dir=None installroot=/ install_weak_deps=True name=['leapp-upgrade'] download_only=False bugfix=False list=None install_repoquery=True update_only=False disable_plugin=[] enablerepo=['rhel-7-server-extras-rpms'] security=False enable_plugin=[]
Mar 18 07:52:26 managed-node01 yum[14600]: Installed: leapp-upgrade-el7toel8-0.20.0-9.el7_9.noarch
Mar 18 07:52:26 managed-node01 yum[14600]: Installed: leapp-0.17.0-2.el7_9.noarch
Mar 18 07:52:27 managed-node01 ansible-ansible.builtin.stat[14678]: Invoked with checksum_algorithm=sha1 get_checksum=True follow=False path=/var/log/leapp/leapp-report.txt get_md5=False get_mime=True get_attributes=True
Mar 18 07:52:28 managed-node01 ansible-ansible.builtin.stat[14739]: Invoked with checksum_algorithm=sha1 get_checksum=True follow=False path=/var/log/leapp/leapp-report.json get_md5=False get_mime=True get_attributes=True
Mar 18 07:52:28 managed-node01 ansible-ansible.builtin.stat[14801]: Invoked with checksum_algorithm=sha1 get_checksum=True follow=False path=/var/log/leapp/leapp-preupgrade.log get_md5=False get_mime=True get_attributes=True
Mar 18 07:52:29 managed-node01 ansible-ansible.builtin.stat[14862]: Invoked with checksum_algorithm=sha1 get_checksum=True follow=False path=/var/log/leapp/ansible_leapp_analysis.log get_md5=False get_mime=True get_attributes=True
Mar 18 07:52:29 managed-node01 ansible-ansible.builtin.lineinfile[14925]: Invoked with group=root insertbefore=None unsafe_writes=False selevel=None create=False seuser=None serole=None backrefs=False search_string=None state=present firstmatch=False mode=0644 insertafter=None path=/var/log/leapp/ansible_leapp_analysis.log owner=root regexp=None line=Job ended at 2026-03-18T11:52:29Z attributes=None backup=False validate=None setype=None
Mar 18 07:52:30 managed-node01 ansible-ansible.legacy.stat[15048]: Invoked with checksum_algorithm=sha1 get_checksum=True path=/var/log/leapp/ansible_leapp_analysis.log follow=True get_md5=False get_mime=True get_attributes=True
Mar 18 07:52:31 managed-node01 ansible-ansible.legacy.copy[15119]: Invoked with src=/var/log/leapp/ansible_leapp_analysis.log directory_mode=None force=True unsafe_writes=False remote_src=True dest=/var/log/leapp/ansible_leapp_analysis_2026-03-18_11-52-17.log selevel=None seuser=None setype=None group=None content=NOT_LOGGING_PARAMETER _original_basename=None serole=None mode=preserve checksum=None owner=None follow=False validate=None attributes=None backup=False local_follow=None
Mar 18 07:52:31 managed-node01 ansible-ansible.builtin.file[15180]: Invoked with src=None selevel=None force=False setype=None _original_basename=None unsafe_writes=False access_time=None seuser=None recurse=False state=absent access_time_format=%Y%m%d%H%M.%S group=None modification_time=None serole=None _diff_peek=None modification_time_format=%Y%m%d%H%M.%S path=/var/log/leapp/ansible_leapp_analysis.log owner=None follow=True attributes=None mode=None
Mar 18 07:52:31 managed-node01 sshd[15191]: Accepted publickey for root from 10.31.15.4 port 56670 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Mar 18 07:52:32 managed-node01 systemd-logind[563]: New session 4 of user root.
-- Subject: A new session 4 has been created for user root
-- Defined-By: systemd
-- Support: http://lists.freedesktop.org/mailman/listinfo/systemd-devel
-- Documentation: http://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A new session with the ID 4 has been created for the user root.
--
-- The leading process of the session is 15191.
Mar 18 07:52:32 managed-node01 systemd[1]: Started Session 4 of user root.
-- Subject: Unit session-4.scope has finished start-up
-- Defined-By: systemd
-- Support: http://lists.freedesktop.org/mailman/listinfo/systemd-devel
--
-- Unit session-4.scope has finished starting up.
--
-- The start-up result is done.
Mar 18 07:52:32 managed-node01 sshd[15191]: pam_unix(sshd:session): session opened for user root by (uid=0)
Mar 18 07:52:32 managed-node01 sshd[15191]: Received disconnect from 10.31.15.4 port 56670:11: disconnected by user
Mar 18 07:52:32 managed-node01 sshd[15191]: Disconnected from 10.31.15.4 port 56670
Mar 18 07:52:32 managed-node01 sshd[15191]: pam_unix(sshd:session): session closed for user root
Mar 18 07:52:32 managed-node01 systemd-logind[563]: Removed session 4.
-- Subject: Session 4 has been terminated
-- Defined-By: systemd
-- Support: http://lists.freedesktop.org/mailman/listinfo/systemd-devel
-- Documentation: http://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A session with the ID 4 has been terminated.
Mar 18 07:52:32 managed-node01 sshd[15204]: Accepted publickey for root from 10.31.15.4 port 56684 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Mar 18 07:52:32 managed-node01 systemd-logind[563]: New session 5 of user root.
-- Subject: A new session 5 has been created for user root
-- Defined-By: systemd
-- Support: http://lists.freedesktop.org/mailman/listinfo/systemd-devel
-- Documentation: http://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A new session with the ID 5 has been created for the user root.
--
-- The leading process of the session is 15204.
Mar 18 07:52:32 managed-node01 systemd[1]: Started Session 5 of user root.
-- Subject: Unit session-5.scope has finished start-up
-- Defined-By: systemd
-- Support: http://lists.freedesktop.org/mailman/listinfo/systemd-devel
--
-- Unit session-5.scope has finished starting up.
--
-- The start-up result is done.
Mar 18 07:52:32 managed-node01 sshd[15204]: pam_unix(sshd:session): session opened for user root by (uid=0)
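For reference, the copy_archive_leapp_log sequence recorded in the play output above (append an end timestamp with lineinfile, copy the log to a timestamped name, then remove the original) can be sketched as follows. The parameters are inferred from the log and simplified; the `job_end_time` and `job_ts` variables are illustrative assumptions, not the role's actual variable names:

```yaml
# Simplified sketch of the log-archiving flow seen in copy_archive_leapp_log;
# job_end_time and job_ts are hypothetical variables standing in for
# whatever timestamps the role actually computes.
- name: Add end time to log file
  ansible.builtin.lineinfile:
    path: /var/log/leapp/ansible_leapp_analysis.log
    line: "Job ended at {{ job_end_time }}"

- name: Copy log file to timestamped location
  ansible.builtin.copy:
    remote_src: true
    src: /var/log/leapp/ansible_leapp_analysis.log
    dest: "/var/log/leapp/ansible_leapp_analysis_{{ job_ts }}.log"
    mode: preserve

- name: Remove original log file
  ansible.builtin.file:
    path: /var/log/leapp/ansible_leapp_analysis.log
    state: absent
```

This matches the journal trail above: the lineinfile invocation at 07:52:29 writes the "Job ended at" line, the copy at 07:52:31 produces ansible_leapp_analysis_2026-03-18_11-52-17.log, and the file task at 07:52:31 removes the original.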