[WARNING]: Collection infra.leapp does not support Ansible version 2.14.18
[WARNING]: running playbook inside collection infra.leapp
ansible-playbook [core 2.14.18]
  config file = /etc/ansible/ansible.cfg
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python3.9/site-packages/ansible
  ansible collection location = /root/.ansible/collections:/usr/share/ansible/collections
  executable location = /usr/bin/ansible-playbook
  python version = 3.9.25 (main, Jan 14 2026, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-11)] (/usr/bin/python3)
  jinja version = 3.1.2
  libyaml = True
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_default.yml ****************************************************
1 plays in /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tests/tests_default.yml

PLAY [Test] ********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tests/tests_default.yml:2
ok: [managed-node01]

TASK [Test | Run role analysis] ************************************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tests/tests_default.yml:9

TASK [infra.leapp.common : Log directory exists] *******************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/main.yml:3
changed: [managed-node01] => {"changed": true, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/var/log/ripu", "secontext": "unconfined_u:object_r:var_log_t:s0", "size": 6, "state": "directory", "uid": 0}

TASK [infra.leapp.common : Check for existing log file] ************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/main.yml:11
ok: [managed-node01] => {"changed": false, "stat": {"exists": false}}

TASK [infra.leapp.common : Fail if log file already exists] ********************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/main.yml:16
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : Create new log file] ********************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/main.yml:23
NOTIFIED HANDLER infra.leapp.common : Check for log file for managed-node01
NOTIFIED HANDLER infra.leapp.common : Add end time to log file for managed-node01
NOTIFIED HANDLER infra.leapp.common : Slurp ripu.log file for managed-node01
NOTIFIED HANDLER infra.leapp.common : Decode ripu.log file for managed-node01
NOTIFIED HANDLER infra.leapp.common : Rename log file for managed-node01
changed: [managed-node01] => {"changed": true, "checksum": "1df6989c87e88d6ba797dbed0cf07c0312703e7a", "dest": "/var/log/ripu/ripu.log", "gid": 0, "group": "root", "md5sum": "3efed60c3d54f2a30bf84856ff0f631f", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:var_log_t:s0", "size": 61, "src": "/root/.ansible/tmp/ansible-tmp-1771331187.6662717-5679-259565565848581/source", "state": "file", "uid": 0}

TASK [infra.leapp.common : /etc/ansible/facts.d directory exists] **************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/main.yml:35
changed: [managed-node01] => {"changed": true, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/etc/ansible/facts.d", "secontext": "unconfined_u:object_r:etc_t:s0", "size": 6, "state": "directory", "uid": 0}

TASK [infra.leapp.common : Capture current ansible_facts for validation after upgrade] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/main.yml:43
changed: [managed-node01] => {"changed": true, "checksum": "0d34de248f22fbf122a7a74e445d4f9f0f2843f2", "dest": "/etc/ansible/facts.d/pre_ripu.fact", "gid": 0, "group": "root", "md5sum": "517aebb76fd5913641c9611a0a7397ec", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:etc_t:s0", "size": 11992, "src": "/root/.ansible/tmp/ansible-tmp-1771331188.5295734-5707-257249652902284/source", "state": "file", "uid": 0}

TASK [infra.leapp.common : Capture a list of non-rhel versioned packages] ******
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/main.yml:51
ok: [managed-node01] => {"changed": false, "cmd": "set -o pipefail; export PATH=$PATH; rpm -qa | grep -ve '[\\.|+]el7' | grep -vE '^(gpg-pubkey|libmodulemd|katello-ca-consumer)' | sort", "delta": "0:00:00.456020", "end": "2026-02-17 07:26:29.746425", "failed_when_result": false, "msg": "", "rc": 0, "start": "2026-02-17 07:26:29.290405", "stderr": "", "stderr_lines": [], "stdout": "epel-release-7-14.noarch\ntps-devel-2.44.50-1.noarch", "stdout_lines": ["epel-release-7-14.noarch", "tps-devel-2.44.50-1.noarch"]}

TASK [infra.leapp.common : Create fact with the non-rhel versioned packages list] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/main.yml:65
ok: [managed-node01] => {"ansible_facts": {"non_rhel_packages": ["epel-release-7-14.noarch", "tps-devel-2.44.50-1.noarch"]}, "changed": false}

TASK [infra.leapp.common : Capture the list of non-rhel versioned packages in a separate fact file] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/main.yml:69
changed: [managed-node01] => {"changed": true, "checksum": "6d36b22d9c2b2f366fc090edfbac427c77d524a5", "dest": "/etc/ansible/facts.d/non_rhel_packages.fact", "gid": 0, "group": "root", "md5sum": "a7d4e8abcc28ebc36ca5401fee060144", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:etc_t:s0", "size": 58, "src": "/root/.ansible/tmp/ansible-tmp-1771331189.8450851-5737-143351151801779/source", "state": "file", "uid": 0}
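
The package scan above can be reproduced by hand on the managed node; this is a minimal sketch using only the command and fact-file path shown in the task results (nothing here is role code):

    # Same filter the role ran: packages not versioned for el7, excluding
    # gpg-pubkey, libmodulemd, and katello-ca-consumer entries.
    rpm -qa | grep -ve '[\.|+]el7' | grep -vE '^(gpg-pubkey|libmodulemd|katello-ca-consumer)' | sort
    # The role also records the result as a custom fact; read it back with:
    cat /etc/ansible/facts.d/non_rhel_packages.fact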

TASK [infra.leapp.analysis : Include tasks for preupg assistant analysis] ******
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/main.yml:9
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.analysis : Include tasks for leapp preupgrade analysis] ******
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/main.yml:13
included: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml for managed-node01

TASK [analysis-leapp | Register with Satellite activation key] *****************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml:2
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [analysis-leapp | Include custom_local_repos for local_repos_pre_leapp] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml:11
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.analysis : analysis-leapp | Install packages for preupgrade analysis on RHEL 7] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml:19
changed: [managed-node01] => {"changed": true, "changes": {"installed": ["leapp-upgrade"], "updated": []}, "msg": "", "rc": 0, "results": ["Loaded plugins: search-disabled-repos\nResolving Dependencies\n--> Running transaction check\n---> Package leapp-upgrade-el7toel8.noarch 0:0.20.0-9.el7_9 will be installed\n--> Processing Dependency: leapp-repository-dependencies = 10 for package: leapp-upgrade-el7toel8-0.20.0-9.el7_9.noarch\n--> Processing Dependency: leapp-framework >= 5.0 for package: leapp-upgrade-el7toel8-0.20.0-9.el7_9.noarch\n--> Processing Dependency: python2-leapp for package: leapp-upgrade-el7toel8-0.20.0-9.el7_9.noarch\n--> Processing Dependency: leapp for package: leapp-upgrade-el7toel8-0.20.0-9.el7_9.noarch\n--> Running transaction check\n---> Package leapp.noarch 0:0.17.0-2.el7_9 will be installed\n---> Package leapp-upgrade-el7toel8-deps.noarch 0:0.20.0-9.el7_9 will be installed\n--> Processing Dependency: dnf >= 4 for package: leapp-upgrade-el7toel8-deps-0.20.0-9.el7_9.noarch\n---> Package python2-leapp.noarch 0:0.17.0-2.el7_9 will be installed\n--> Processing Dependency: leapp-framework-dependencies = 5 for package: python2-leapp-0.17.0-2.el7_9.noarch\n--> Running transaction check\n---> Package dnf.noarch 0:4.0.9.2-2.el7_9 will be installed\n--> Processing Dependency: python2-dnf = 4.0.9.2-2.el7_9 for package: dnf-4.0.9.2-2.el7_9.noarch\n---> Package leapp-deps.noarch 0:0.17.0-2.el7_9 will be installed\n--> Running transaction check\n---> Package python2-dnf.noarch 0:4.0.9.2-2.el7_9 will be installed\n--> Processing Dependency: dnf-data = 4.0.9.2-2.el7_9 for package: python2-dnf-4.0.9.2-2.el7_9.noarch\n--> Processing Dependency: python2-libdnf >= 0.22.5 for package: python2-dnf-4.0.9.2-2.el7_9.noarch\n--> Processing Dependency: python2-libcomps >= 0.1.8 for package: python2-dnf-4.0.9.2-2.el7_9.noarch\n--> Processing Dependency: python2-hawkey >= 0.22.5 for package: python2-dnf-4.0.9.2-2.el7_9.noarch\n--> Processing Dependency: libmodulemd >= 1.4.0 for package: python2-dnf-4.0.9.2-2.el7_9.noarch\n--> Processing Dependency: python2-libdnf for package: python2-dnf-4.0.9.2-2.el7_9.noarch\n--> Running transaction check\n---> Package dnf-data.noarch 0:4.0.9.2-2.el7_9 will be installed\n--> Processing Dependency: libreport-filesystem for package: dnf-data-4.0.9.2-2.el7_9.noarch\n---> Package libmodulemd.x86_64 0:1.6.3-1.el7 will be installed\n---> Package python2-hawkey.x86_64 0:0.22.5-2.el7_9 will be installed\n--> Processing Dependency: libdnf(x86-64) = 0.22.5-2.el7_9 for package: python2-hawkey-0.22.5-2.el7_9.x86_64\n--> Processing Dependency: libsolvext.so.0(SOLV_1.0)(64bit) for package: python2-hawkey-0.22.5-2.el7_9.x86_64\n--> Processing Dependency: libsolv.so.0(SOLV_1.0)(64bit) for package: python2-hawkey-0.22.5-2.el7_9.x86_64\n--> Processing Dependency: libsolvext.so.0()(64bit) for package: python2-hawkey-0.22.5-2.el7_9.x86_64\n--> Processing Dependency: libsolv.so.0()(64bit) for package: python2-hawkey-0.22.5-2.el7_9.x86_64\n--> Processing Dependency: librhsm.so.0()(64bit) for package: python2-hawkey-0.22.5-2.el7_9.x86_64\n--> Processing Dependency: librepo.so.0()(64bit) for package: python2-hawkey-0.22.5-2.el7_9.x86_64\n--> Processing Dependency: libjson-glib-1.0.so.0()(64bit) for package: python2-hawkey-0.22.5-2.el7_9.x86_64\n--> Processing Dependency: libdnf.so.2()(64bit) for package: python2-hawkey-0.22.5-2.el7_9.x86_64\n---> Package python2-libcomps.x86_64 0:0.1.8-14.el7 will be installed\n--> Processing Dependency: libcomps(x86-64) = 0.1.8-14.el7 for package: python2-libcomps-0.1.8-14.el7.x86_64\n--> Processing Dependency: libcomps.so.0.1.6()(64bit) for package: python2-libcomps-0.1.8-14.el7.x86_64\n---> Package python2-libdnf.x86_64 0:0.22.5-2.el7_9 will be installed\n--> Running transaction check\n---> Package json-glib.x86_64 0:1.4.2-2.el7 will be installed\n---> Package libcomps.x86_64 0:0.1.8-14.el7 will be installed\n---> Package libdnf.x86_64 0:0.22.5-2.el7_9 will be installed\n---> Package librepo.x86_64 0:1.8.1-8.el7_9 will be installed\n---> Package libreport-filesystem.x86_64 0:2.1.11-53.el7 will be installed\n---> Package librhsm.x86_64 0:0.0.3-3.el7_9 will be installed\n---> Package libsolv.x86_64 0:0.6.34-4.el7 will be installed\n--> Finished Dependency Resolution\n\nDependencies Resolved\n\n================================================================================\n Package Arch Version Repository Size\n================================================================================\nInstalling:\n leapp-upgrade-el7toel8 noarch 0.20.0-9.el7_9 rhel-7-server-extras-rpms 1.2 M\nInstalling for dependencies:\n dnf noarch 4.0.9.2-2.el7_9 rhel-7-server-extras-rpms 357 k\n dnf-data noarch 4.0.9.2-2.el7_9 rhel-7-server-extras-rpms 51 k\n json-glib x86_64 1.4.2-2.el7 rhel 134 k\n leapp noarch 0.17.0-2.el7_9 rhel-7-server-extras-rpms 29 k\n leapp-deps noarch 0.17.0-2.el7_9 rhel-7-server-extras-rpms 12 k\n leapp-upgrade-el7toel8-deps\n noarch 0.20.0-9.el7_9 rhel-7-server-extras-rpms 37 k\n libcomps x86_64 0.1.8-14.el7 rhel-7-server-extras-rpms 75 k\n libdnf x86_64 0.22.5-2.el7_9 rhel-7-server-extras-rpms 536 k\n libmodulemd x86_64 1.6.3-1.el7 rhel-7-server-extras-rpms 153 k\n librepo x86_64 1.8.1-8.el7_9 rhel 82 k\n libreport-filesystem x86_64 2.1.11-53.el7 rhel 41 k\n librhsm x86_64 0.0.3-3.el7_9 rhel-7-server-extras-rpms 28 k\n libsolv x86_64 0.6.34-4.el7 rhel 329 k\n python2-dnf noarch 4.0.9.2-2.el7_9 rhel-7-server-extras-rpms 414 k\n python2-hawkey x86_64 0.22.5-2.el7_9 rhel-7-server-extras-rpms 71 k\n python2-leapp noarch 0.17.0-2.el7_9 rhel-7-server-extras-rpms 178 k\n python2-libcomps x86_64 0.1.8-14.el7 rhel-7-server-extras-rpms 47 k\n python2-libdnf x86_64 0.22.5-2.el7_9 rhel-7-server-extras-rpms 611 k\n\nTransaction Summary\n================================================================================\nInstall 1 Package (+18 Dependent packages)\n\nTotal download size: 4.3 M\nInstalled size: 21 M\nDownloading packages:\n--------------------------------------------------------------------------------\nTotal 8.4 MB/s | 4.3 MB 00:00 \nRunning transaction check\nRunning transaction test\nTransaction test succeeded\nRunning transaction\n Installing : json-glib-1.4.2-2.el7.x86_64 1/19 \n Installing : libmodulemd-1.6.3-1.el7.x86_64 2/19 \n Installing : librhsm-0.0.3-3.el7_9.x86_64 3/19 \n Installing : librepo-1.8.1-8.el7_9.x86_64 4/19 \n Installing : libsolv-0.6.34-4.el7.x86_64 5/19 \n Installing : libdnf-0.22.5-2.el7_9.x86_64 6/19 \n Installing : python2-libdnf-0.22.5-2.el7_9.x86_64 7/19 \n Installing : python2-hawkey-0.22.5-2.el7_9.x86_64 8/19 \n Installing : leapp-deps-0.17.0-2.el7_9.noarch 9/19 \n Installing : python2-leapp-0.17.0-2.el7_9.noarch 10/19 \n Installing : libcomps-0.1.8-14.el7.x86_64 11/19 \n Installing : python2-libcomps-0.1.8-14.el7.x86_64 12/19 \n Installing : libreport-filesystem-2.1.11-53.el7.x86_64 13/19 \n Installing : dnf-data-4.0.9.2-2.el7_9.noarch 14/19 \n Installing : python2-dnf-4.0.9.2-2.el7_9.noarch 15/19 \n Installing : dnf-4.0.9.2-2.el7_9.noarch 16/19 \n Installing : leapp-upgrade-el7toel8-deps-0.20.0-9.el7_9.noarch 17/19 \n Installing : leapp-0.17.0-2.el7_9.noarch 18/19 \n Installing : leapp-upgrade-el7toel8-0.20.0-9.el7_9.noarch 19/19 \n Verifying : libsolv-0.6.34-4.el7.x86_64 1/19 \n Verifying : librepo-1.8.1-8.el7_9.x86_64 2/19 \n Verifying : python2-libcomps-0.1.8-14.el7.x86_64 3/19 \n Verifying : dnf-4.0.9.2-2.el7_9.noarch 4/19 \n Verifying : leapp-upgrade-el7toel8-0.20.0-9.el7_9.noarch 5/19 \n Verifying : libdnf-0.22.5-2.el7_9.x86_64 6/19 \n Verifying : librhsm-0.0.3-3.el7_9.x86_64 7/19 \n Verifying : python2-leapp-0.17.0-2.el7_9.noarch 8/19 \n Verifying : python2-hawkey-0.22.5-2.el7_9.x86_64 9/19 \n Verifying : libmodulemd-1.6.3-1.el7.x86_64 10/19 \n Verifying : dnf-data-4.0.9.2-2.el7_9.noarch 11/19 \n Verifying : libreport-filesystem-2.1.11-53.el7.x86_64 12/19 \n Verifying : leapp-0.17.0-2.el7_9.noarch 13/19 \n Verifying : python2-dnf-4.0.9.2-2.el7_9.noarch 14/19 \n Verifying : leapp-upgrade-el7toel8-deps-0.20.0-9.el7_9.noarch 15/19 \n Verifying : json-glib-1.4.2-2.el7.x86_64 16/19 \n Verifying : python2-libdnf-0.22.5-2.el7_9.x86_64 17/19 \n Verifying : libcomps-0.1.8-14.el7.x86_64 18/19 \n Verifying : leapp-deps-0.17.0-2.el7_9.noarch 19/19 \n\nInstalled:\n leapp-upgrade-el7toel8.noarch 0:0.20.0-9.el7_9 \n\nDependency Installed:\n dnf.noarch 0:4.0.9.2-2.el7_9 \n dnf-data.noarch 0:4.0.9.2-2.el7_9 \n json-glib.x86_64 0:1.4.2-2.el7 \n leapp.noarch 0:0.17.0-2.el7_9 \n leapp-deps.noarch 0:0.17.0-2.el7_9 \n leapp-upgrade-el7toel8-deps.noarch 0:0.20.0-9.el7_9 \n libcomps.x86_64 0:0.1.8-14.el7 \n libdnf.x86_64 0:0.22.5-2.el7_9 \n libmodulemd.x86_64 0:1.6.3-1.el7 \n librepo.x86_64 0:1.8.1-8.el7_9 \n libreport-filesystem.x86_64 0:2.1.11-53.el7 \n librhsm.x86_64 0:0.0.3-3.el7_9 \n libsolv.x86_64 0:0.6.34-4.el7 \n python2-dnf.noarch 0:4.0.9.2-2.el7_9 \n python2-hawkey.x86_64 0:0.22.5-2.el7_9 \n python2-leapp.noarch 0:0.17.0-2.el7_9 \n python2-libcomps.x86_64 0:0.1.8-14.el7 \n python2-libdnf.x86_64 0:0.22.5-2.el7_9 \n\nComplete!\n"]}

TASK [infra.leapp.analysis : analysis-leapp | Install packages for preupgrade analysis on RHEL 8] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml:26
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.analysis : analysis-leapp | Install packages for preupgrade analysis on RHEL 9] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml:33
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.analysis : analysis-leapp | Ensure leapp log directory exists] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml:40
ok: [managed-node01] => {"changed": false, "gid": 0, "group": "root", "mode": "0700", "owner": "root", "path": "/var/log/leapp", "secontext": "system_u:object_r:var_log_t:s0", "size": 6, "state": "directory", "uid": 0}
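
A quick sanity check after the install step, sketched from the package names in the transaction above (not part of the role; the --version flag is assumed from the leapp CLI):

    # Confirm the leapp tooling landed before the preupgrade run.
    rpm -q leapp leapp-upgrade-el7toel8 python2-leapp
    leapp --version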

TASK [infra.leapp.analysis : analysis-leapp | Populate leapp_answers file] *****
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml:48
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [analysis-leapp | Create /etc/leapp/files/leapp_upgrade_repositories.repo] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml:57
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.analysis : analysis-leapp | Leapp preupgrade report] *********
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml:68
ASYNC FAILED on managed-node01: jid=j849430504225.4267
changed: [managed-node01] => {"ansible_job_id": "j849430504225.4267", "changed": true, "cmd": "set -o pipefail; export PATH=$PATH; ulimit -n 16384; leapp preupgrade --report-schema=1.2.0 2>&1 | tee -a /var/log/ripu/ripu.log\n", "delta": "0:00:40.392694", "end": "2026-02-17 07:27:18.920831", "failed_when_result": false, "finished": 1, "msg": "non-zero return code", "rc": 1, "results_file": "/root/.ansible_async/j849430504225.4267", "start": "2026-02-17 07:26:38.528137", "started": 1, "stderr": "", "stderr_lines": [], "stdout": "==> Processing phase `configuration_phase`\n====> * ipu_workflow_config\n IPU workflow config actor\n==> Processing phase `FactsCollection`\n====> * firewalld_facts_actor\n Provide data about firewalld\n====> * load_device_driver_deprecation_data\n Loads deprecation data for drivers and devices (PCI & CPU)\n====> * scan_kernel_cmdline\n No documentation has been provided for the scan_kernel_cmdline actor.\n====> * scan_systemd_source\n Provides info about systemd on the source system\n====> * udevadm_info\n Produces data exported by the \"udevadm info\" command.\n====> * get_enabled_modules\n Provides data about which module streams are enabled on the source system.\n====> * persistentnetnames\n Get network interface information for physical ethernet interfaces of the original system.\n====> * scan_grub_device_name\n Find the name of the block devices where GRUB is located\n====> * scanmemory\n Scan Memory of the machine.\n====> * persistentnetnamesdisable\n Disable systemd-udevd persistent network naming on machine with single eth0 NIC\n====> * scan_target_os_image\n Scans the provided target OS ISO image to use as a content source for the IPU, if any.\n====> * common_leapp_dracut_modules\n Influences the generation of the initram disk\n====> * transaction_workarounds\n Provides additional RPM transaction tasks based on bundled RPM packages.\n====> * scan_source_files\n Scan files (explicitly specified) of the source system.\n====> * check_grub_legacy\n Check whether GRUB Legacy is installed in the MBR.\n====> * sssd_facts\n Check SSSD configuration for changes in RHEL8 and report them in model.\n====> * scanzfcp\n In case of s390x architecture, check whether ZFCP is used.\n====> * scan_subscription_manager_info\n Scans the current system for subscription manager information\n====> * storage_scanner\n Provides data about storage settings.\n====> * scandasd\n In case of s390x architecture, check whether DASD is used.\n====> * scancpu\n Scan CPUs of the machine.\n====> * register_yum_adjustment\n Registers a workaround which will adjust the yum directories during the upgrade.\n====> * system_facts\n Provides data about many facts from system.\n====> * scan_pkg_manager\n Provides data about package manager (yum/dnf)\n====> * selinuxcontentscanner\n Scan the system for any SELinux customizations\n====> * source_boot_loader_scanner\n Scans the boot loader configuration on the source system.\n====> * pci_devices_scanner\n Provides data about existing PCI Devices.\n====> * repository_mapping\n Produces message containing repository mapping based on provided file.\n====> * root_scanner\n Scan the system root directory and produce a message containing\n====> * scanclienablerepo\n Produce CustomTargetRepository based on the LEAPP_ENABLE_REPOS in config.\n====> * rpm_scanner\n Provides data about installed RPM Packages.\nLoaded plugins: product-id, subscription-manager\n\nThis system is not registered with an entitlement server. You can use subscription-manager to register.\n\n====> * scan_grub_config\n Scan grub configuration files for errors.\n====> * read_openssh_config\n Collect information about the OpenSSH configuration.\n====> * scan_files_for_target_userspace\n Scan the source system and identify files that will be copied into the target userspace when it is created.\n====> * pam_modules_scanner\n Scan the pam directory for services and modules used in them\n====> * network_manager_read_config\n Provides data about NetworkManager configuration.\n====> * scan_custom_modifications_actor\n Collects information about files in leapp directories that have been modified or newly added.\n====> * scan_custom_repofile\n Scan the custom /etc/leapp/files/leapp_upgrade_repositories.repo repo file.\n====> * distribution_signed_rpm_scanner\n Provide data about distribution signed & unsigned RPM packages.\n====> * scan_sap_hana\n Gathers information related to SAP HANA instances on the system.\n====> * scan_grub_device_partition_layout\n Scan all identified GRUB devices for their partition layout.\n====> * authselect_scanner\n Detect what authselect configuration should be suggested to administrator.\n====> * tcp_wrappers_config_read\n Parse tcp_wrappers configuration files /etc/hosts.{allow,deny}.\n====> * copy_dnf_conf_into_target_userspace\n Copy dnf.conf into target userspace\n====> * removed_pam_modules_scanner\n Scan PAM configuration for modules that are not available in RHEL-8.\n====> * biosdevname\n Enable biosdevname on the target RHEL system if all interfaces on the source RHEL\n====> * scan_fips\n Determine whether the source system has FIPS enabled.\n====> * used_repository_scanner\n Scan used enabled repositories\n====> * ipa_scanner\n Scan system for ipa-client and ipa-server status\n====> * satellite_upgrade_facts\n Report which Satellite packages require updates and how to handle PostgreSQL data\n====> * detect_kernel_drivers\n Matches all currently loaded kernel drivers against known deprecated and removed drivers.\n====> * xfs_info_scanner\n This actor scans all mounted mountpoints for XFS information\n====> * check_kde_apps\n Actor checks which KDE apps are installed.\n====> * multipath_conf_read\n Read multipath configuration files and extract the necessary information\n====> * checkrhui\n Check if system is using RHUI infrastructure (on public cloud) and send messages to\n====> * sctp_read_status\n Determines whether or not the SCTP kernel module might be wanted.\n====> * get_installed_desktops\n Actor checks if kde or gnome desktop environments\n====> * trusted_gpg_keys_scanner\n Scan for trusted GPG keys.\n====> * spamassassin_config_read\n Reads spamc configuration (/etc/mail/spamassassin/spamc.conf), the\n====> * vsftpd_config_read\n Reads vsftpd configuration files (/etc/vsftpd/*.conf) and extracts necessary information.\n====> * remove_obsolete_gpg_keys\n Remove obsoleted RPM GPG keys.\n====> * scan_source_kernel\n Scan the source system kernel.\n====> * repositories_blacklist\n Exclude target repositories provided by Red Hat without support.\n====> * quagga_daemons\n Active quagga daemons check.\n====> * rpm_transaction_config_tasks_collector\n Provides additional RPM transaction tasks from /etc/leapp/transaction.\n====> * scan_dynamic_linker_configuration\n Scan the dynamic linker configuration and find modifications.\n====> * cups_scanner\n Gather facts about CUPS features which needs to be migrated\n====> * pes_events_scanner\n Provides data about package events from Package Evolution Service.\n====> * setuptargetrepos\n Produces list of repositories that should be available to be used by Upgrade process.\n\n============================================================\n ERRORS \n============================================================\n\n2026-02-17 07:26:53.215648 [ERROR] Actor: scan_subscription_manager_info\nMessage: A subscription-manager command failed to execute\nSummary:\n Link: https://access.redhat.com/solutions/6138372\n Details: Command ['subscription-manager', 'release'] failed with exit code 1.\n Stderr: This system is not yet registered. Try 'subscription-manager register --help' for more information.\n Hint: Please ensure you have a valid RHEL subscription and your network is up. If you are using proxy for Red Hat subscription-manager, please make sure it is specified inside the /etc/rhsm/rhsm.conf file. Or use the --no-rhsm option when running leapp, if you do not want to use subscription-manager for the in-place upgrade and you want to deliver all target repositories by yourself or using RHUI on public cloud.\n\n============================================================\n END OF ERRORS \n============================================================\n\nDebug output written to /var/log/leapp/leapp-preupgrade.log\n\n============================================================\n REPORT OVERVIEW \n============================================================\n\nFollowing errors occurred and the upgrade cannot continue:\n 1. Actor: scan_subscription_manager_info\n Message: A subscription-manager command failed to execute\n\nHIGH and MEDIUM severity reports:\n 1. Packages available in excluded repositories will not be installed\n\nReports summary:\n Errors: 1\n Inhibitors: 0\n HIGH severity reports: 1\n MEDIUM severity reports: 0\n LOW severity reports: 0\n INFO severity reports: 1\n\nBefore continuing, review the full report below for details about discovered problems and possible remediation instructions:\n A report has been generated at /var/log/leapp/leapp-report.txt\n A report has been generated at /var/log/leapp/leapp-report.json\n\n============================================================\n END OF REPORT OVERVIEW \n============================================================\n\nAnswerfile has been generated at /var/log/leapp/answerfile", "stdout_lines": ["==> Processing phase `configuration_phase`", "====> * ipu_workflow_config", " IPU workflow config actor", "==> Processing phase `FactsCollection`", "====> * firewalld_facts_actor", " Provide data about firewalld", "====> * load_device_driver_deprecation_data", " Loads deprecation data for drivers and devices (PCI & CPU)", "====> * scan_kernel_cmdline", " No documentation has been provided for the scan_kernel_cmdline actor.", "====> * scan_systemd_source", " Provides info about systemd on the source system", "====> * udevadm_info", " Produces data exported by the \"udevadm info\" command.", "====> * get_enabled_modules", " Provides data about which module streams are enabled on the source system.", "====> * persistentnetnames", " Get network interface information for physical ethernet interfaces of the original system.", "====> * scan_grub_device_name", " Find the name of the block devices where GRUB is located", "====> * scanmemory", " Scan Memory of the machine.", "====> * persistentnetnamesdisable", " Disable systemd-udevd persistent network naming on machine with single eth0 NIC", "====> * scan_target_os_image", " Scans the provided target OS ISO image to use as a content source for the IPU, if any.", "====> * common_leapp_dracut_modules", " Influences the generation of the initram disk", "====> * transaction_workarounds", " Provides additional RPM transaction tasks based on bundled RPM packages.", "====> * scan_source_files", " Scan files (explicitly specified) of the source system.", "====> * check_grub_legacy", " Check whether GRUB Legacy is installed in the MBR.", "====> * sssd_facts", " Check SSSD configuration for changes in RHEL8 and report them in model.", "====> * scanzfcp", " In case of s390x architecture, check whether ZFCP is used.", "====> * scan_subscription_manager_info", " Scans the current system for subscription manager information", "====> * storage_scanner", " Provides data about storage settings.", "====> * scandasd", " In case of s390x architecture, check whether DASD is used.", "====> * scancpu", " Scan CPUs of the machine.", "====> * register_yum_adjustment", " Registers a workaround which will adjust the yum directories during the upgrade.", "====> * system_facts", " Provides data about many facts from system.", "====> * scan_pkg_manager", " Provides data about package manager (yum/dnf)", "====> * selinuxcontentscanner", " Scan the system for any SELinux customizations", "====> * source_boot_loader_scanner", " Scans the boot loader configuration on the source system.", "====> * pci_devices_scanner", " Provides data about existing PCI Devices.", "====> * repository_mapping", " Produces message containing repository mapping based on provided file.", "====> * root_scanner", " Scan the system root directory and produce a message containing", "====> * scanclienablerepo", " Produce CustomTargetRepository based on the LEAPP_ENABLE_REPOS in config.", "====> * rpm_scanner", " Provides data about installed RPM Packages.", "Loaded plugins: product-id, subscription-manager", "", "This system is not registered with an entitlement server. You can use subscription-manager to register.", "", "====> * scan_grub_config", " Scan grub configuration files for errors.", "====> * read_openssh_config", " Collect information about the OpenSSH configuration.", "====> * scan_files_for_target_userspace", " Scan the source system and identify files that will be copied into the target userspace when it is created.", "====> * pam_modules_scanner", " Scan the pam directory for services and modules used in them", "====> * network_manager_read_config", " Provides data about NetworkManager configuration.", "====> * scan_custom_modifications_actor", " Collects information about files in leapp directories that have been modified or newly added.", "====> * scan_custom_repofile", " Scan the custom /etc/leapp/files/leapp_upgrade_repositories.repo repo file.", "====> * distribution_signed_rpm_scanner", " Provide data about distribution signed & unsigned RPM packages.", "====> * scan_sap_hana", " Gathers information related to SAP HANA instances on the system.", "====> * scan_grub_device_partition_layout", " Scan all identified GRUB devices for their partition layout.", "====> * authselect_scanner", " Detect what authselect configuration should be suggested to administrator.", "====> * tcp_wrappers_config_read", " Parse tcp_wrappers configuration files /etc/hosts.{allow,deny}.", "====> * copy_dnf_conf_into_target_userspace", " Copy dnf.conf into target userspace", "====> * removed_pam_modules_scanner", " Scan PAM configuration for modules that are not available in RHEL-8.", "====> * biosdevname", " Enable biosdevname on the target RHEL system if all interfaces on the source RHEL", "====> * scan_fips", " Determine whether the source system has FIPS enabled.", "====> * used_repository_scanner", " Scan used enabled repositories", "====> * ipa_scanner", " Scan system for ipa-client and ipa-server status", "====> * satellite_upgrade_facts", " Report which Satellite packages require updates and how to handle PostgreSQL data", "====> * detect_kernel_drivers", " Matches all currently loaded kernel drivers against known deprecated and removed drivers.", "====> * xfs_info_scanner", " This actor scans all mounted mountpoints for XFS information", "====> * check_kde_apps", " Actor checks which KDE apps are installed.", "====> * multipath_conf_read", " Read multipath configuration files and extract the necessary information", "====> * checkrhui", " Check if system is using RHUI infrastructure (on public cloud) and send messages to", "====> * sctp_read_status", " Determines whether or not the SCTP kernel module might be wanted.", "====> * get_installed_desktops", " Actor checks if kde or gnome desktop environments", "====> * trusted_gpg_keys_scanner", " Scan for trusted GPG keys.", "====> * spamassassin_config_read", " Reads spamc configuration (/etc/mail/spamassassin/spamc.conf), the", "====> * vsftpd_config_read", " Reads vsftpd configuration files (/etc/vsftpd/*.conf) and extracts necessary information.", "====> * remove_obsolete_gpg_keys", " Remove obsoleted RPM GPG keys.", "====> * scan_source_kernel", " Scan the source system kernel.", "====> * repositories_blacklist", " Exclude target repositories provided by Red Hat without support.", "====> * quagga_daemons", " Active quagga daemons check.", "====> * rpm_transaction_config_tasks_collector", " Provides additional RPM transaction tasks from /etc/leapp/transaction.", "====> * scan_dynamic_linker_configuration", " Scan the dynamic linker configuration and find modifications.", "====> * cups_scanner", " Gather facts about CUPS features which needs to be migrated", "====> * pes_events_scanner", " Provides data about package events from Package Evolution Service.", "====> * setuptargetrepos", " Produces list of repositories that should be available to be used by Upgrade process.", "", "============================================================", " ERRORS ", "============================================================", "", "2026-02-17 07:26:53.215648 [ERROR] Actor: scan_subscription_manager_info", "Message: A subscription-manager command failed to execute", "Summary:", " Link: https://access.redhat.com/solutions/6138372", " Details: Command ['subscription-manager', 'release'] failed with exit code 1.", " Stderr: This system is not yet registered. Try 'subscription-manager register --help' for more information.", " Hint: Please ensure you have a valid RHEL subscription and your network is up. If you are using proxy for Red Hat subscription-manager, please make sure it is specified inside the /etc/rhsm/rhsm.conf file. Or use the --no-rhsm option when running leapp, if you do not want to use subscription-manager for the in-place upgrade and you want to deliver all target repositories by yourself or using RHUI on public cloud.", "", "============================================================", " END OF ERRORS ", "============================================================", "", "Debug output written to /var/log/leapp/leapp-preupgrade.log", "", "============================================================", " REPORT OVERVIEW ", "============================================================", "", "Following errors occurred and the upgrade cannot continue:", " 1. Actor: scan_subscription_manager_info", " Message: A subscription-manager command failed to execute", "", "HIGH and MEDIUM severity reports:", " 1. Packages available in excluded repositories will not be installed", "", "Reports summary:", " Errors: 1", " Inhibitors: 0", " HIGH severity reports: 1", " MEDIUM severity reports: 0", " LOW severity reports: 0", " INFO severity reports: 1", "", "Before continuing, review the full report below for details about discovered problems and possible remediation instructions:", " A report has been generated at /var/log/leapp/leapp-report.txt", " A report has been generated at /var/log/leapp/leapp-report.json", "", "============================================================", " END OF REPORT OVERVIEW ", "============================================================", "", "Answerfile has been generated at /var/log/leapp/answerfile"]}
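
The actor failure above is what later drives upgrade_inhibited. Both remediation paths come straight from the hint text in the report; this sketch only restates them as commands (the ORG and KEY values are placeholders, not from this run):

    # Option 1: register the system so 'subscription-manager release' succeeds.
    subscription-manager register --org <ORG> --activationkey <KEY>
    # Option 2: skip RHSM and deliver the target repositories yourself via
    # /etc/leapp/files/leapp_upgrade_repositories.repo (the skipped task above
    # would normally create that file).
    leapp preupgrade --report-schema=1.2.0 --no-rhsm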

TASK [analysis-leapp | Include custom_local_repos for local_repos_post_analysis] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml:86
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [analysis-leapp | Restore original Satellite activation key] **************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml:96
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.analysis : analysis-leapp | Include check-results-file.yml] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml:107
included: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/check-results-file.yml for managed-node01

TASK [infra.leapp.analysis : check-results-file | Result file status] **********
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/check-results-file.yml:2
ok: [managed-node01] => {"changed": false, "stat": {"atime": 1771331238.49706, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "e6f63114c5658544b255fcd8e414882006c9fc7b", "ctime": 1771331238.49706, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 360710225, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1771331238.49706, "nlink": 1, "path": "/var/log/leapp/leapp-report.txt", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 2804, "uid": 0, "version": "18446744071844160382", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}}

TASK [infra.leapp.analysis : check-results-file | Check that result file exists] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/check-results-file.yml:7
ok: [managed-node01] => {
    "changed": false,
    "msg": "All assertions passed"
}

TASK [analysis-leapp | Run parse_leapp_report to check for inhibitors] *********
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml:110

TASK [infra.leapp.common : parse_leapp_report | Default upgrade_inhibited to false] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/parse_leapp_report.yml:12
ok: [managed-node01] => {"ansible_facts": {"upgrade_inhibited": false}, "changed": false}

TASK [infra.leapp.common : parse_leapp_report | Collect human readable report results] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/parse_leapp_report.yml:16
ok: [managed-node01] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}

TASK [infra.leapp.common : parse_leapp_report | Collect JSON report results] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/parse_leapp_report.yml:22
ok: [managed-node01] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}

TASK [infra.leapp.common : parse_leapp_report | Parse report results] **********
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/parse_leapp_report.yml:28
ok: [managed-node01] => {"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false}

TASK [infra.leapp.common : parse_leapp_report | Clear leapp_inhibitors] ********
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/parse_leapp_report.yml:35
ok: [managed-node01] => {"ansible_facts": {"leapp_inhibitors": []}, "changed": false}

TASK [infra.leapp.common : parse_leapp_report | Check for inhibitors] **********
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/parse_leapp_report.yml:39
ok: [managed-node01] => (item={'groups': ['error'], 'title': 'A subscription-manager command failed to execute', 'timeStamp': '2026-02-17T12:26:53.215835Z', 'hostname': 'managed-node01', 'actor': 'scan_subscription_manager_info', 'summary': '{"link": "https://access.redhat.com/solutions/6138372", "details": "Command [\'subscription-manager\', \'release\'] failed with exit code 1.", "stderr": "This system is not yet registered. Try \'subscription-manager register --help\' for more information.\\n", "hint": "Please ensure you have a valid RHEL subscription and your network is up. If you are using proxy for Red Hat subscription-manager, please make sure it is specified inside the /etc/rhsm/rhsm.conf file. Or use the --no-rhsm option when running leapp, if you do not want to use subscription-manager for the in-place upgrade and you want to deliver all target repositories by yourself or using RHUI on public cloud."}', 'audience': 'sysadmin', 'key': '7ec8269784db1bba2ac54ae438689ef397e16833', 'id': '963e118b4a373b8fd412ec64b96dbd9516450bc8cd46792a3bc2256849e01ef7', 'severity': 'high'}) => {"ansible_facts": {"leapp_inhibitors": [{"actor": "scan_subscription_manager_info", "audience": "sysadmin", "groups": ["error"], "hostname": "managed-node01", "id": "963e118b4a373b8fd412ec64b96dbd9516450bc8cd46792a3bc2256849e01ef7", "key": "7ec8269784db1bba2ac54ae438689ef397e16833", "severity": "high", "summary": "{\"link\": \"https://access.redhat.com/solutions/6138372\", \"details\": \"Command ['subscription-manager', 'release'] failed with exit code 1.\", \"stderr\": \"This system is not yet registered. Try 'subscription-manager register --help' for more information.\\n\", \"hint\": \"Please ensure you have a valid RHEL subscription and your network is up. If you are using proxy for Red Hat subscription-manager, please make sure it is specified inside the /etc/rhsm/rhsm.conf file. Or use the --no-rhsm option when running leapp, if you do not want to use subscription-manager for the in-place upgrade and you want to deliver all target repositories by yourself or using RHUI on public cloud.\"}", "timeStamp": "2026-02-17T12:26:53.215835Z", "title": "A subscription-manager command failed to execute"}], "upgrade_inhibited": true}, "ansible_loop_var": "item", "changed": false, "item": {"actor": "scan_subscription_manager_info", "audience": "sysadmin", "groups": ["error"], "hostname": "managed-node01", "id": "963e118b4a373b8fd412ec64b96dbd9516450bc8cd46792a3bc2256849e01ef7", "key": "7ec8269784db1bba2ac54ae438689ef397e16833", "severity": "high", "summary": "{\"link\": \"https://access.redhat.com/solutions/6138372\", \"details\": \"Command ['subscription-manager', 'release'] failed with exit code 1.\", \"stderr\": \"This system is not yet registered. Try 'subscription-manager register --help' for more information.\\n\", \"hint\": \"Please ensure you have a valid RHEL subscription and your network is up. If you are using proxy for Red Hat subscription-manager, please make sure it is specified inside the /etc/rhsm/rhsm.conf file. Or use the --no-rhsm option when running leapp, if you do not want to use subscription-manager for the in-place upgrade and you want to deliver all target repositories by yourself or using RHUI on public cloud.\"}", "timeStamp": "2026-02-17T12:26:53.215835Z", "title": "A subscription-manager command failed to execute"}}
skipping: [managed-node01] => (item={'groups': ['repository', 'failure'], 'title': 'Excluded target system repositories', 'timeStamp': '2026-02-17T12:27:16.347523Z', 'hostname': 'managed-node01', 'detail': {'remediations': [{'type': 'hint', 'context': 'If some of excluded repositories are still required to be used during the upgrade, execute leapp with the --enablerepo option with the repoid of the repository required to be enabled as an argument (the option can be used multiple times).'}]}, 'actor': 'repositories_blacklist', 'summary': 'The following repositories are not supported by Red Hat and are excluded from the list of repositories used during the upgrade.\n- codeready-builder-beta-for-rhel-8-s390x-rpms\n- codeready-builder-beta-for-rhel-8-ppc64le-rpms\n- rhui-codeready-builder-for-rhel-8-x86_64-rhui-rpms\n- codeready-builder-for-rhel-8-aarch64-eus-rpms\n- codeready-builder-for-rhel-8-ppc64le-eus-rpms\n- codeready-builder-beta-for-rhel-8-x86_64-rpms\n- codeready-builder-for-rhel-8-aarch64-rpms\n- codeready-builder-for-rhel-8-s390x-rpms\n- codeready-builder-for-rhel-8-s390x-eus-rpms\n- codeready-builder-for-rhel-8-x86_64-eus-rpms\n- rhui-codeready-builder-for-rhel-8-aarch64-rhui-rpms\n- codeready-builder-beta-for-rhel-8-aarch64-rpms\n- codeready-builder-for-rhel-8-rhui-rpms\n- codeready-builder-for-rhel-8-x86_64-rhui-rpms\n- codeready-builder-for-rhel-8-x86_64-rpms\n- codeready-builder-for-rhel-8-x86_64-eus-rhui-rpms\n- codeready-builder-for-rhel-8-ppc64le-rpms', 'audience': 'sysadmin', 'key': '1b9132cb2362ae7830e48eee7811be9527747de8', 'id': '0df6530b1203079dac25db7abcd5538546383f0fa07c018e72e01a2178897925', 'severity': 'info'}) => {"ansible_loop_var": "item", "changed": false, "item": {"actor": "repositories_blacklist", "audience": "sysadmin", "detail": {"remediations": [{"context": "If some of excluded repositories are still required to be used during the upgrade, execute leapp with the --enablerepo option with the repoid of the repository required to be enabled as an argument (the option can be used multiple times).", "type": "hint"}]}, "groups": ["repository", "failure"], "hostname": "managed-node01", "id": "0df6530b1203079dac25db7abcd5538546383f0fa07c018e72e01a2178897925", "key": "1b9132cb2362ae7830e48eee7811be9527747de8", "severity": "info", "summary": "The following repositories are not supported by Red Hat and are excluded from the list of repositories used during the upgrade.\n- codeready-builder-beta-for-rhel-8-s390x-rpms\n- codeready-builder-beta-for-rhel-8-ppc64le-rpms\n- rhui-codeready-builder-for-rhel-8-x86_64-rhui-rpms\n- codeready-builder-for-rhel-8-aarch64-eus-rpms\n- codeready-builder-for-rhel-8-ppc64le-eus-rpms\n- codeready-builder-beta-for-rhel-8-x86_64-rpms\n- codeready-builder-for-rhel-8-aarch64-rpms\n- codeready-builder-for-rhel-8-s390x-rpms\n- codeready-builder-for-rhel-8-s390x-eus-rpms\n- codeready-builder-for-rhel-8-x86_64-eus-rpms\n- rhui-codeready-builder-for-rhel-8-aarch64-rhui-rpms\n- codeready-builder-beta-for-rhel-8-aarch64-rpms\n- codeready-builder-for-rhel-8-rhui-rpms\n- codeready-builder-for-rhel-8-x86_64-rhui-rpms\n- codeready-builder-for-rhel-8-x86_64-rpms\n- codeready-builder-for-rhel-8-x86_64-eus-rhui-rpms\n- codeready-builder-for-rhel-8-ppc64le-rpms", "timeStamp": "2026-02-17T12:27:16.347523Z", "title": "Excluded target system repositories"}, "skip_reason": "Conditional result was False"}
skipping: [managed-node01] => (item={'groups': ['repository'], 'title': 'Packages available in excluded repositories will not be installed', 'timeStamp': '2026-02-17T12:27:18.337919Z', 'hostname': 'managed-node01', 'detail': {'related_resources': [{'scheme': 'package', 'title': 'python3-pyxattr'}, {'scheme': 'package', 'title': 'rpcgen'}]}, 'actor': 'pes_events_scanner', 'summary': '2 packages will be skipped because they are available only in target system repositories that are intentionally excluded from the list of repositories used during the upgrade. See the report message titled "Excluded target system repositories" for details.\nThe list of these packages:\n- python3-pyxattr (repoid: codeready-builder-for-rhel-8-x86_64-rpms)\n- rpcgen (repoid: codeready-builder-for-rhel-8-x86_64-rpms)', 'audience': 'sysadmin', 'key': '2437e204808f987477c0e9be8e4c95b3a87a9f3e', 'id': '7abd27f7e0e234d7a4ee6e94f5bfdd21cc82d9b384c4a1692e29f5c18974b53f', 'severity': 'high'}) => {"ansible_loop_var": "item", "changed": false, "item": {"actor": "pes_events_scanner", "audience": "sysadmin", "detail": {"related_resources": [{"scheme": "package", "title": "python3-pyxattr"}, {"scheme": "package", "title": "rpcgen"}]}, "groups": ["repository"], "hostname": "managed-node01", "id": "7abd27f7e0e234d7a4ee6e94f5bfdd21cc82d9b384c4a1692e29f5c18974b53f", "key": "2437e204808f987477c0e9be8e4c95b3a87a9f3e", "severity": "high", "summary": "2 packages will be skipped because they are available only in target system repositories that are intentionally excluded from the list of repositories used during the upgrade. See the report message titled \"Excluded target system repositories\" for details.\nThe list of these packages:\n- python3-pyxattr (repoid: codeready-builder-for-rhel-8-x86_64-rpms)\n- rpcgen (repoid: codeready-builder-for-rhel-8-x86_64-rpms)", "timeStamp": "2026-02-17T12:27:18.337919Z", "title": "Packages available in excluded repositories will not be installed"}, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : parse_leapp_report | Collect inhibitors] ************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/parse_leapp_report.yml:51
ok: [managed-node01] => {"changed": false, "cmd": ["awk", "/\\(inhibitor\\)/,/^-------/", "/var/log/leapp/leapp-report.txt"], "delta": "0:00:00.003271", "end": "2026-02-17 07:27:40.765595", "failed_when_result": false, "msg": "", "rc": 0, "start": "2026-02-17 07:27:40.762324", "stderr": "", "stderr_lines": [], "stdout": "", "stdout_lines": []}

TASK [infra.leapp.common : parse_leapp_report | Collect high errors] ***********
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/parse_leapp_report.yml:60
ok: [managed-node01] => {"changed": false, "cmd": ["awk", "/high \\(error\\)/,/^-------/", "/var/log/leapp/leapp-report.txt"], "delta": "0:00:00.003947", "end": "2026-02-17 07:27:41.023948", "failed_when_result": false, "msg": "", "rc": 0, "start": "2026-02-17 07:27:41.020001", "stderr": "", "stderr_lines": [], "stdout": "Risk Factor: high (error)\nTitle: A subscription-manager command failed to execute\nSummary: {\"link\": \"https://access.redhat.com/solutions/6138372\", \"details\": \"Command ['subscription-manager', 'release'] failed with exit code 1.\", \"stderr\": \"This system is not yet registered. Try 'subscription-manager register --help' for more information.\\n\", \"hint\": \"Please ensure you have a valid RHEL subscription and your network is up. If you are using proxy for Red Hat subscription-manager, please make sure it is specified inside the /etc/rhsm/rhsm.conf file. Or use the --no-rhsm option when running leapp, if you do not want to use subscription-manager for the in-place upgrade and you want to deliver all target repositories by yourself or using RHUI on public cloud.\"}\nKey: 7ec8269784db1bba2ac54ae438689ef397e16833\n----------------------------------------", "stdout_lines": ["Risk Factor: high (error)", "Title: A subscription-manager command failed to execute", "Summary: {\"link\": \"https://access.redhat.com/solutions/6138372\", \"details\": \"Command ['subscription-manager', 'release'] failed with exit code 1.\", \"stderr\": \"This system is not yet registered. Try 'subscription-manager register --help' for more information.\\n\", \"hint\": \"Please ensure you have a valid RHEL subscription and your network is up. If you are using proxy for Red Hat subscription-manager, please make sure it is specified inside the /etc/rhsm/rhsm.conf file. Or use the --no-rhsm option when running leapp, if you do not want to use subscription-manager for the in-place upgrade and you want to deliver all target repositories by yourself or using RHUI on public cloud.\"}", "Key: 7ec8269784db1bba2ac54ae438689ef397e16833", "----------------------------------------"]}
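
The role greps the text report with awk, as shown above; the JSON report can be queried the same way. A sketch, assuming jq is available and that the report layout matches the 'entries'/'groups' fields visible in the parsed items above (report schema 1.2.0):

    # Titles of inhibitor entries, if any.
    jq -r '.entries[] | select(.groups | index("inhibitor")) | .title' /var/log/leapp/leapp-report.json
    # Titles of error entries (this run has one, from scan_subscription_manager_info).
    jq -r '.entries[] | select(.groups | index("error")) | .title' /var/log/leapp/leapp-report.json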
{"ansible_loop_var": "inhibitor", "changed": false, "inhibitor": {"actor": "scan_subscription_manager_info", "audience": "sysadmin", "groups": ["error"], "hostname": "managed-node01", "id": "963e118b4a373b8fd412ec64b96dbd9516450bc8cd46792a3bc2256849e01ef7", "key": "7ec8269784db1bba2ac54ae438689ef397e16833", "severity": "high", "summary": "{\"link\": \"https://access.redhat.com/solutions/6138372\", \"details\": \"Command ['subscription-manager', 'release'] failed with exit code 1.\", \"stderr\": \"This system is not yet registered. Try 'subscription-manager register --help' for more information.\\n\", \"hint\": \"Please ensure you have a valid RHEL subscription and your network is up. If you are using proxy for Red Hat subscription-manager, please make sure it is specified inside the /etc/rhsm/rhsm.conf file. Or use the --no-rhsm option when running leapp, if you do not want to use subscription-manager for the in-place upgrade and you want to deliver all target repositories by yourself or using RHUI on public cloud.\"}", "timeStamp": "2026-02-17T12:26:53.215835Z", "title": "A subscription-manager command failed to execute"}, "msg": "Failed to template loop_control.label: {{ __leapp_inhibitors_key_map[inhibitor_key]\n if inhibitor_key in __leapp_inhibitors_key_map\n else __leapp_inhibitors_title_map[inhibitor_title] }}: 'dict object' has no attribute 'A subscription-manager command failed to execute'. 'dict object' has no attribute 'A subscription-manager command failed to execute'. {{ __leapp_inhibitors_key_map[inhibitor_key]\n if inhibitor_key in __leapp_inhibitors_key_map\n else __leapp_inhibitors_title_map[inhibitor_title] }}: 'dict object' has no attribute 'A subscription-manager command failed to execute'. 'dict object' has no attribute 'A subscription-manager command failed to execute'", "skip_reason": "Conditional result was False"} TASK [Cleanup | Remove log files] ********************************************** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tests/tests_default.yml:13 changed: [managed-node01] => {"changed": true, "cmd": "set -euxo pipefail\nrm -f /var/log/leapp/leapp-upgrade.log\nrm -f /var/log/ripu/ripu.log*\n", "delta": "0:00:00.004564", "end": "2026-02-17 07:27:42.221217", "msg": "", "rc": 0, "start": "2026-02-17 07:27:42.216653", "stderr": "+ rm -f /var/log/leapp/leapp-upgrade.log\n+ rm -f /var/log/ripu/ripu.log", "stderr_lines": ["+ rm -f /var/log/leapp/leapp-upgrade.log", "+ rm -f /var/log/ripu/ripu.log"], "stdout": "", "stdout_lines": []} PLAY RECAP ********************************************************************* managed-node01 : ok=30 changed=10 unreachable=0 failed=1 skipped=10 rescued=0 ignored=0 -- Logs begin at Tue 2026-02-17 07:23:37 EST, end at Tue 2026-02-17 07:27:42 EST. -- Feb 17 07:26:25 managed-node01 sshd[3338]: Accepted publickey for root from 10.31.8.42 port 52688 ssh2: ECDSA SHA256:D3axHAvFbwi9dHfRTrLRFOkIDtPvUdC3d0Gx6RR6uB0 Feb 17 07:26:25 managed-node01 systemd[1]: Started Session 6 of user root. -- Subject: Unit session-6.scope has finished start-up -- Defined-By: systemd -- Support: http://lists.freedesktop.org/mailman/listinfo/systemd-devel -- -- Unit session-6.scope has finished starting up. -- -- The start-up result is done. Feb 17 07:26:25 managed-node01 systemd-logind[537]: New session 6 of user root. 
TASK [Cleanup | Remove log files] **********************************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tests/tests_default.yml:13
changed: [managed-node01] => {"changed": true, "cmd": "set -euxo pipefail\nrm -f /var/log/leapp/leapp-upgrade.log\nrm -f /var/log/ripu/ripu.log*\n", "delta": "0:00:00.004564", "end": "2026-02-17 07:27:42.221217", "msg": "", "rc": 0, "start": "2026-02-17 07:27:42.216653", "stderr": "+ rm -f /var/log/leapp/leapp-upgrade.log\n+ rm -f /var/log/ripu/ripu.log", "stderr_lines": ["+ rm -f /var/log/leapp/leapp-upgrade.log", "+ rm -f /var/log/ripu/ripu.log"], "stdout": "", "stdout_lines": []}

PLAY RECAP *********************************************************************
managed-node01 : ok=30 changed=10 unreachable=0 failed=1 skipped=10 rescued=0 ignored=0

-- Logs begin at Tue 2026-02-17 07:23:37 EST, end at Tue 2026-02-17 07:27:42 EST. --
Feb 17 07:26:25 managed-node01 sshd[3338]: Accepted publickey for root from 10.31.8.42 port 52688 ssh2: ECDSA SHA256:D3axHAvFbwi9dHfRTrLRFOkIDtPvUdC3d0Gx6RR6uB0
Feb 17 07:26:25 managed-node01 systemd[1]: Started Session 6 of user root.
-- Subject: Unit session-6.scope has finished start-up
-- Defined-By: systemd
-- Support: http://lists.freedesktop.org/mailman/listinfo/systemd-devel
--
-- Unit session-6.scope has finished starting up.
--
-- The start-up result is done.
Feb 17 07:26:25 managed-node01 systemd-logind[537]: New session 6 of user root.
-- Subject: A new session 6 has been created for user root
-- Defined-By: systemd
-- Support: http://lists.freedesktop.org/mailman/listinfo/systemd-devel
-- Documentation: http://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A new session with the ID 6 has been created for the user root.
--
-- The leading process of the session is 3338.
Feb 17 07:26:25 managed-node01 sshd[3338]: pam_unix(sshd:session): session opened for user root by (uid=0)
Feb 17 07:26:26 managed-node01 ansible-ansible.legacy.setup[3409]: Invoked with filter=[] gather_subset=['all'] fact_path=/etc/ansible/facts.d gather_timeout=10
Feb 17 07:26:27 managed-node01 ansible-ansible.builtin.file[3500]: Invoked with src=None selevel=None force=False setype=None _original_basename=None unsafe_writes=False access_time=None seuser=None recurse=False state=directory access_time_format=%Y%m%d%H%M.%S group=root modification_time=None serole=None _diff_peek=None modification_time_format=%Y%m%d%H%M.%S path=/var/log/ripu owner=root follow=True attributes=None mode=0755
Feb 17 07:26:27 managed-node01 ansible-ansible.builtin.stat[3561]: Invoked with checksum_algorithm=sha1 get_checksum=True follow=False path=/var/log/ripu/ripu.log get_md5=False get_mime=True get_attributes=True
Feb 17 07:26:27 managed-node01 ansible-ansible.legacy.stat[3622]: Invoked with checksum_algorithm=sha1 get_checksum=True path=/var/log/ripu/ripu.log follow=False get_md5=False get_mime=True get_attributes=True
Feb 17 07:26:28 managed-node01 ansible-ansible.legacy.copy[3668]: Invoked with src=/root/.ansible/tmp/ansible-tmp-1771331187.6662717-5679-259565565848581/source directory_mode=None force=True attributes=None remote_src=None unsafe_writes=False dest=/var/log/ripu/ripu.log seuser=None setype=None group=root content=NOT_LOGGING_PARAMETER _original_basename=tmpxnlrawhe serole=None mode=0644 selevel=None owner=root follow=False validate=None checksum=1df6989c87e88d6ba797dbed0cf07c0312703e7a backup=False local_follow=None
Feb 17 07:26:28 managed-node01 ansible-ansible.builtin.file[3729]: Invoked with src=None selevel=None force=False setype=None _original_basename=None unsafe_writes=False access_time=None seuser=None recurse=False state=directory access_time_format=%Y%m%d%H%M.%S group=root modification_time=None serole=None _diff_peek=None modification_time_format=%Y%m%d%H%M.%S path=/etc/ansible/facts.d owner=root follow=True attributes=None mode=0755
Feb 17 07:26:28 managed-node01 ansible-ansible.legacy.stat[3790]: Invoked with checksum_algorithm=sha1 get_checksum=True path=/etc/ansible/facts.d/pre_ripu.fact follow=False get_md5=False get_mime=True get_attributes=True
Feb 17 07:26:28 managed-node01 ansible-ansible.legacy.copy[3836]: Invoked with src=/root/.ansible/tmp/ansible-tmp-1771331188.5295734-5707-257249652902284/source directory_mode=None force=True attributes=None remote_src=None unsafe_writes=False dest=/etc/ansible/facts.d/pre_ripu.fact seuser=None setype=None group=root content=NOT_LOGGING_PARAMETER _original_basename=tmpr9x3mfs8 serole=None mode=0644 selevel=None owner=root follow=False validate=None checksum=0d34de248f22fbf122a7a74e445d4f9f0f2843f2 backup=False local_follow=None
Feb 17 07:26:29 managed-node01 ansible-ansible.legacy.command[3897]: Invoked with executable=None _uses_shell=True strip_empty_ends=True _raw_params=set -o pipefail; export PATH=$PATH; rpm -qa | grep -ve '[\.|+]el7' | grep -vE '^(gpg-pubkey|libmodulemd|katello-ca-consumer)' | sort removes=None argv=None creates=None chdir=None stdin_add_newline=True stdin=None
Feb 17 07:26:30 managed-node01 ansible-ansible.legacy.stat[3963]: Invoked with checksum_algorithm=sha1 get_checksum=True path=/etc/ansible/facts.d/non_rhel_packages.fact follow=False get_md5=False get_mime=True get_attributes=True
Feb 17 07:26:30 managed-node01 ansible-ansible.legacy.copy[4009]: Invoked with src=/root/.ansible/tmp/ansible-tmp-1771331189.8450851-5737-143351151801779/source directory_mode=None force=True attributes=None remote_src=None unsafe_writes=False dest=/etc/ansible/facts.d/non_rhel_packages.fact seuser=None setype=None group=root content=NOT_LOGGING_PARAMETER _original_basename=tmpgkme9eq4 serole=None mode=0644 selevel=None owner=root follow=False validate=None checksum=6d36b22d9c2b2f366fc090edfbac427c77d524a5 backup=False local_follow=None
Feb 17 07:26:31 managed-node01 ansible-ansible.legacy.yum[4070]: Invoked with lock_timeout=30 update_cache=False conf_file=None exclude=[] allow_downgrade=False sslverify=True disable_gpg_check=False disable_excludes=None use_backend=auto validate_certs=True state=latest disablerepo=[] releasever=None skip_broken=False cacheonly=False autoremove=False download_dir=None installroot=/ install_weak_deps=True name=['leapp-upgrade'] download_only=False bugfix=False list=None install_repoquery=True update_only=False disable_plugin=[] enablerepo=['rhel-7-server-extras-rpms'] security=False enable_plugin=[]
Feb 17 07:26:35 managed-node01 yum[4088]: Installed: json-glib-1.4.2-2.el7.x86_64
Feb 17 07:26:35 managed-node01 yum[4088]: Installed: libmodulemd-1.6.3-1.el7.x86_64
Feb 17 07:26:35 managed-node01 yum[4088]: Installed: librhsm-0.0.3-3.el7_9.x86_64
Feb 17 07:26:35 managed-node01 yum[4088]: Installed: librepo-1.8.1-8.el7_9.x86_64
Feb 17 07:26:35 managed-node01 yum[4088]: Installed: libsolv-0.6.34-4.el7.x86_64
Feb 17 07:26:35 managed-node01 yum[4088]: Installed: libdnf-0.22.5-2.el7_9.x86_64
Feb 17 07:26:35 managed-node01 yum[4088]: Installed: python2-libdnf-0.22.5-2.el7_9.x86_64
Feb 17 07:26:35 managed-node01 yum[4088]: Installed: python2-hawkey-0.22.5-2.el7_9.x86_64
Feb 17 07:26:35 managed-node01 yum[4088]: Installed: leapp-deps-0.17.0-2.el7_9.noarch
Feb 17 07:26:35 managed-node01 yum[4088]: Installed: python2-leapp-0.17.0-2.el7_9.noarch
Feb 17 07:26:35 managed-node01 yum[4088]: Installed: libcomps-0.1.8-14.el7.x86_64
Feb 17 07:26:35 managed-node01 yum[4088]: Installed: python2-libcomps-0.1.8-14.el7.x86_64
Feb 17 07:26:35 managed-node01 yum[4088]: Installed: libreport-filesystem-2.1.11-53.el7.x86_64
Feb 17 07:26:35 managed-node01 yum[4088]: Installed: dnf-data-4.0.9.2-2.el7_9.noarch
Feb 17 07:26:36 managed-node01 yum[4088]: Installed: python2-dnf-4.0.9.2-2.el7_9.noarch
Feb 17 07:26:36 managed-node01 systemd[1]: Reloading.
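
The rpm -qa pipeline and the non_rhel_packages.fact copy recorded above correspond to the "non-rhel versioned packages" tasks earlier in the play: the first grep's character class drops anything versioned with .el7, |el7, or +el7, and the second excludes packages that legitimately carry no el7 tag. A self-contained sketch of that capture, with illustrative register and task wiring; the commands and the fact path are taken verbatim from the log:

    - name: Capture a list of non-rhel versioned packages
      ansible.builtin.shell: |
        set -o pipefail
        rpm -qa | grep -ve '[\.|+]el7' | grep -vE '^(gpg-pubkey|libmodulemd|katello-ca-consumer)' | sort
      register: non_rhel_packages_result
      changed_when: false
      failed_when: false

    - name: Persist the list as a local fact
      # Writing JSON under /etc/ansible/facts.d makes the list reappear as
      # ansible_local.non_rhel_packages on the next fact gathering.
      ansible.builtin.copy:
        content: "{{ {'non_rhel_packages': non_rhel_packages_result.stdout_lines} | to_nice_json }}"
        dest: /etc/ansible/facts.d/non_rhel_packages.fact
        mode: "0644"
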
Feb 17 07:26:36 managed-node01 yum[4088]: Installed: dnf-4.0.9.2-2.el7_9.noarch
Feb 17 07:26:36 managed-node01 yum[4088]: Installed: leapp-upgrade-el7toel8-deps-0.20.0-9.el7_9.noarch
Feb 17 07:26:36 managed-node01 yum[4088]: Installed: leapp-0.17.0-2.el7_9.noarch
Feb 17 07:26:37 managed-node01 yum[4088]: Installed: leapp-upgrade-el7toel8-0.20.0-9.el7_9.noarch
Feb 17 07:26:37 managed-node01 ansible-ansible.builtin.file[4199]: Invoked with src=None selevel=None force=False setype=None _original_basename=None unsafe_writes=False access_time=None seuser=None recurse=False state=directory access_time_format=%Y%m%d%H%M.%S group=root modification_time=None serole=None _diff_peek=None modification_time_format=%Y%m%d%H%M.%S path=/var/log/leapp owner=root follow=True attributes=None mode=0700
Feb 17 07:26:38 managed-node01 ansible-async_wrapper.py[4267]: Invoked with j849430504225 7200 /root/.ansible/tmp/ansible-tmp-1771331198.099378-5811-165490129583070/AnsiballZ_command.py _
Feb 17 07:26:38 managed-node01 ansible-async_wrapper.py[4270]: Starting module and watcher
Feb 17 07:26:38 managed-node01 ansible-async_wrapper.py[4270]: Start watching 4271 (7200)
Feb 17 07:26:38 managed-node01 ansible-async_wrapper.py[4271]: Start module (4271)
Feb 17 07:26:38 managed-node01 ansible-async_wrapper.py[4267]: Return async_wrapper task started.
Feb 17 07:26:38 managed-node01 ansible-ansible.legacy.command[4272]: Invoked with executable=/bin/bash _uses_shell=True strip_empty_ends=True _raw_params=set -o pipefail; export PATH=$PATH; ulimit -n 16384; leapp preupgrade --report-schema=1.2.0 2>&1 | tee -a /var/log/ripu/ripu.log removes=None argv=None creates=None chdir=None stdin_add_newline=True stdin=None
Feb 17 07:26:43 managed-node01 ansible-async_wrapper.py[4270]: 4271 still running (7200)
Feb 17 07:26:48 managed-node01 ansible-async_wrapper.py[4270]: 4271 still running (7195)
Feb 17 07:26:51 managed-node01 dbus[544]: [system] Activating service name='com.redhat.SubscriptionManager' (using servicehelper)
Feb 17 07:26:52 managed-node01 dbus[544]: [system] Successfully activated service 'com.redhat.SubscriptionManager'
Feb 17 07:26:53 managed-node01 ansible-async_wrapper.py[4270]: 4271 still running (7190)
Feb 17 07:26:53 managed-node01 systemd[1]: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 5991 (sysctl)
Feb 17 07:26:53 managed-node01 systemd[1]: Mounting Arbitrary Executable File Formats File System...
-- Subject: Unit proc-sys-fs-binfmt_misc.mount has begun start-up
-- Defined-By: systemd
-- Support: http://lists.freedesktop.org/mailman/listinfo/systemd-devel
--
-- Unit proc-sys-fs-binfmt_misc.mount has begun starting up.
Feb 17 07:26:53 managed-node01 systemd[1]: Mounted Arbitrary Executable File Formats File System.
-- Subject: Unit proc-sys-fs-binfmt_misc.mount has finished start-up
-- Defined-By: systemd
-- Support: http://lists.freedesktop.org/mailman/listinfo/systemd-devel
--
-- Unit proc-sys-fs-binfmt_misc.mount has finished starting up.
--
-- The start-up result is done.
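
The async_wrapper entries show how the preupgrade is run: the command module is launched with a 7200-second async budget and a watcher process on the node, while the controller checks the job separately (the async_status calls appear further down). A rough equivalent as tasks, with illustrative register names and polling values; the command itself is taken verbatim from the log:

    - name: Run leapp preupgrade in the background
      ansible.builtin.shell: |
        set -o pipefail
        ulimit -n 16384
        leapp preupgrade --report-schema=1.2.0 2>&1 | tee -a /var/log/ripu/ripu.log
      args:
        executable: /bin/bash
      async: 7200   # matches the watcher budget in the journal
      poll: 0
      register: preupgrade_job

    - name: Wait for the preupgrade to finish
      ansible.builtin.async_status:
        jid: "{{ preupgrade_job.ansible_job_id }}"
      register: preupgrade_result
      until: preupgrade_result.finished
      retries: 720
      delay: 10
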
Feb 17 07:26:53 managed-node01 kernel: nr_pdflush_threads exported in /proc is scheduled for removal
Feb 17 07:26:58 managed-node01 ansible-async_wrapper.py[4270]: 4271 still running (7185)
Feb 17 07:27:03 managed-node01 ansible-async_wrapper.py[4270]: 4271 still running (7180)
Feb 17 07:27:08 managed-node01 ansible-async_wrapper.py[4270]: 4271 still running (7175)
Feb 17 07:27:13 managed-node01 ansible-async_wrapper.py[4270]: 4271 still running (7170)
Feb 17 07:27:18 managed-node01 ansible-async_wrapper.py[4270]: 4271 still running (7165)
Feb 17 07:27:18 managed-node01 ansible-async_wrapper.py[4271]: Module complete (4271)
Feb 17 07:27:23 managed-node01 ansible-async_wrapper.py[4270]: Done in kid B.
Feb 17 07:27:37 managed-node01 sshd[3338]: Received disconnect from 10.31.8.42 port 52688:11: disconnected by user
Feb 17 07:27:37 managed-node01 sshd[3338]: Disconnected from 10.31.8.42 port 52688
Feb 17 07:27:37 managed-node01 sshd[3338]: pam_unix(sshd:session): session closed for user root
Feb 17 07:27:37 managed-node01 systemd-logind[537]: Removed session 6.
-- Subject: Session 6 has been terminated
-- Defined-By: systemd
-- Support: http://lists.freedesktop.org/mailman/listinfo/systemd-devel
-- Documentation: http://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A session with the ID 6 has been terminated.
Feb 17 07:27:38 managed-node01 sshd[8354]: Accepted publickey for root from 10.31.8.42 port 59102 ssh2: ECDSA SHA256:D3axHAvFbwi9dHfRTrLRFOkIDtPvUdC3d0Gx6RR6uB0
Feb 17 07:27:38 managed-node01 systemd-logind[537]: New session 7 of user root.
-- Subject: A new session 7 has been created for user root
-- Defined-By: systemd
-- Support: http://lists.freedesktop.org/mailman/listinfo/systemd-devel
-- Documentation: http://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A new session with the ID 7 has been created for the user root.
--
-- The leading process of the session is 8354.
Feb 17 07:27:38 managed-node01 systemd[1]: Started Session 7 of user root.
-- Subject: Unit session-7.scope has finished start-up
-- Defined-By: systemd
-- Support: http://lists.freedesktop.org/mailman/listinfo/systemd-devel
--
-- Unit session-7.scope has finished starting up.
--
-- The start-up result is done.
Feb 17 07:27:38 managed-node01 sshd[8354]: pam_unix(sshd:session): session opened for user root by (uid=0)
Feb 17 07:27:39 managed-node01 ansible-ansible.legacy.async_status[8415]: Invoked with jid=j849430504225.4267 mode=status _async_dir=/root/.ansible_async
Feb 17 07:27:39 managed-node01 ansible-ansible.legacy.async_status[8453]: Invoked with jid=j849430504225.4267 mode=cleanup _async_dir=/root/.ansible_async
Feb 17 07:27:39 managed-node01 ansible-ansible.builtin.stat[8515]: Invoked with checksum_algorithm=sha1 get_checksum=True follow=False path=/var/log/leapp/leapp-report.txt get_md5=False get_mime=True get_attributes=True
Feb 17 07:27:40 managed-node01 ansible-ansible.legacy.command[8700]: Invoked with executable=None _uses_shell=False strip_empty_ends=True _raw_params=awk '/\(inhibitor\)/,/^-------/' /var/log/leapp/leapp-report.txt removes=None argv=None creates=None chdir=None stdin_add_newline=True stdin=None
Feb 17 07:27:41 managed-node01 ansible-ansible.legacy.command[8763]: Invoked with executable=None _uses_shell=False strip_empty_ends=True _raw_params=awk '/high \(error\)/,/^-------/' /var/log/leapp/leapp-report.txt removes=None argv=None creates=None chdir=None stdin_add_newline=True stdin=None
Feb 17 07:27:41 managed-node01 ansible-ansible.legacy.stat[8825]: Invoked with checksum_algorithm=sha1 get_checksum=True path=/var/log/leapp/leapp-report.json follow=True get_md5=False get_mime=True get_attributes=True
Feb 17 07:27:41 managed-node01 ansible-ansible.legacy.stat[8895]: Invoked with checksum_algorithm=sha1 get_checksum=True path=/var/log/leapp/leapp-report.txt follow=True get_md5=False get_mime=True get_attributes=True
Feb 17 07:27:42 managed-node01 ansible-ansible.legacy.command[8966]: Invoked with executable=/bin/bash _uses_shell=True strip_empty_ends=True _raw_params=set -euxo pipefail rm -f /var/log/leapp/leapp-upgrade.log rm -f /var/log/ripu/ripu.log* removes=None argv=None creates=None chdir=None stdin_add_newline=True stdin=None
Feb 17 07:27:42 managed-node01 sshd[8979]: Accepted publickey for root from 10.31.8.42 port 59104 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Feb 17 07:27:42 managed-node01 systemd[1]: Started Session 8 of user root.
-- Subject: Unit session-8.scope has finished start-up
-- Defined-By: systemd
-- Support: http://lists.freedesktop.org/mailman/listinfo/systemd-devel
--
-- Unit session-8.scope has finished starting up.
--
-- The start-up result is done.
Feb 17 07:27:42 managed-node01 systemd-logind[537]: New session 8 of user root.
-- Subject: A new session 8 has been created for user root
-- Defined-By: systemd
-- Support: http://lists.freedesktop.org/mailman/listinfo/systemd-devel
-- Documentation: http://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A new session with the ID 8 has been created for the user root.
--
-- The leading process of the session is 8979.
Feb 17 07:27:42 managed-node01 sshd[8979]: pam_unix(sshd:session): session opened for user root by (uid=0)
Feb 17 07:27:42 managed-node01 sshd[8979]: Received disconnect from 10.31.8.42 port 59104:11: disconnected by user
Feb 17 07:27:42 managed-node01 sshd[8979]: Disconnected from 10.31.8.42 port 59104
Feb 17 07:27:42 managed-node01 sshd[8979]: pam_unix(sshd:session): session closed for user root
Feb 17 07:27:42 managed-node01 systemd-logind[537]: Removed session 8.
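
The two awk invocations above use awk's range-pattern form /start/,/end/: each prints everything from a line matching "(inhibitor)" (or "high (error)") up to the next ruler of dashes, which is how the relevant sections are sliced out of leapp-report.txt. Expressed as tasks, with illustrative register names; the awk commands themselves are copied verbatim from the journal:

    - name: Collect inhibitor sections from the report
      ansible.builtin.command: awk '/\(inhibitor\)/,/^-------/' /var/log/leapp/leapp-report.txt
      register: leapp_inhibitors
      changed_when: false

    - name: Collect high-severity error sections from the report
      ansible.builtin.command: awk '/high \(error\)/,/^-------/' /var/log/leapp/leapp-report.txt
      register: leapp_errors
      changed_when: false
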
-- Subject: Session 8 has been terminated
-- Defined-By: systemd
-- Support: http://lists.freedesktop.org/mailman/listinfo/systemd-devel
-- Documentation: http://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A session with the ID 8 has been terminated.
Feb 17 07:27:42 managed-node01 sshd[8991]: Accepted publickey for root from 10.31.8.42 port 59120 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Feb 17 07:27:42 managed-node01 systemd[1]: Started Session 9 of user root.
-- Subject: Unit session-9.scope has finished start-up
-- Defined-By: systemd
-- Support: http://lists.freedesktop.org/mailman/listinfo/systemd-devel
--
-- Unit session-9.scope has finished starting up.
--
-- The start-up result is done.
Feb 17 07:27:42 managed-node01 systemd-logind[537]: New session 9 of user root.
-- Subject: A new session 9 has been created for user root
-- Defined-By: systemd
-- Support: http://lists.freedesktop.org/mailman/listinfo/systemd-devel
-- Documentation: http://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A new session with the ID 9 has been created for the user root.
--
-- The leading process of the session is 8991.
Feb 17 07:27:42 managed-node01 sshd[8991]: pam_unix(sshd:session): session opened for user root by (uid=0)