[WARNING]: Collection infra.leapp does not support Ansible version 2.14.18
[WARNING]: running playbook inside collection infra.leapp
[WARNING]: Collection community.general does not support Ansible version 2.14.18
ansible-playbook [core 2.14.18]
  config file = /etc/ansible/ansible.cfg
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python3.9/site-packages/ansible
  ansible collection location = /root/.ansible/collections:/usr/share/ansible/collections
  executable location = /usr/bin/ansible-playbook
  python version = 3.9.23 (main, Aug 19 2025, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-11)] (/usr/bin/python3)
  jinja version = 3.1.2
  libyaml = True
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_default.yml ****************************************************
1 plays in /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tests/tests_default.yml

PLAY [Test] ********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tests/tests_default.yml:2
ok: [managed-node01]

TASK [infra.leapp.common : Log directory exists] *******************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/main.yml:3
ok: [managed-node01] => {"changed": false, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/var/log/ripu", "secontext": "unconfined_u:object_r:var_log_t:s0", "size": 6, "state": "directory", "uid": 0}

TASK [infra.leapp.common : Check for existing log file] ************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/main.yml:11
ok: [managed-node01] => {"changed": false, "stat": {"exists": false}}

TASK [infra.leapp.common : Fail if log file already exists] ********************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/main.yml:16
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : Create new log file] ********************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/main.yml:23
NOTIFIED HANDLER infra.leapp.common : Check for log file for managed-node01
NOTIFIED HANDLER infra.leapp.common : Add end time to log file for managed-node01
NOTIFIED HANDLER infra.leapp.common : Slurp ripu.log file for managed-node01
NOTIFIED HANDLER infra.leapp.common : Decode ripu.log file for managed-node01
NOTIFIED HANDLER infra.leapp.common : Rename log file for managed-node01
changed: [managed-node01] => {"changed": true, "checksum": "673af3d02ce91d427364dc0ada5e7ffcc35a9dd5", "dest": "/var/log/ripu/ripu.log", "gid": 0, "group": "root", "md5sum": "1c2a9059edcb1647c4770b36b84225b9", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:var_log_t:s0", "size": 61, "src": "/root/.ansible/tmp/ansible-tmp-1764192688.6229005-12254-226189522363991/source", "state": "file", "uid": 0}

TASK [infra.leapp.common : /etc/ansible/facts.d directory exists] **************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/main.yml:35
ok: [managed-node01] => {"changed": false, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/etc/ansible/facts.d", "secontext": "unconfined_u:object_r:etc_t:s0", "size": 86, "state": "directory", "uid": 0}

TASK [infra.leapp.common : Capture current ansible_facts for validation after upgrade] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/main.yml:43
changed: [managed-node01] => {"changed": true, "checksum": "7844e973c7e4971475e20576ca75f492c991c406", "dest": "/etc/ansible/facts.d/pre_ripu.fact", "gid": 0, "group": "root", "md5sum": "dad46294b3d7adafc5240f30dc7df086", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:etc_t:s0", "size": 14151, "src": "/root/.ansible/tmp/ansible-tmp-1764192689.8072124-12282-12841330646624/source", "state": "file", "uid": 0}

TASK [infra.leapp.common : Capture a list of non-rhel versioned packages] ******
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/main.yml:51
ok: [managed-node01] => {"changed": false, "cmd": "set -o pipefail; export PATH=$PATH; rpm -qa | grep -ve '[\\.|+]el10' | grep -vE '^(gpg-pubkey|libmodulemd|katello-ca-consumer)' | sort", "delta": "0:00:00.331952", "end": "2025-11-26 16:31:31.180041", "failed_when_result": false, "msg": "non-zero return code", "rc": 1, "start": "2025-11-26 16:31:30.848089", "stderr": "", "stderr_lines": [], "stdout": "", "stdout_lines": []}

TASK [infra.leapp.common : Create fact with the non-rhel versioned packages list] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/main.yml:65
ok: [managed-node01] => {"ansible_facts": {"non_rhel_packages": []}, "changed": false}

TASK [infra.leapp.common : Capture the list of non-rhel versioned packages in a separate fact file] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/main.yml:69
ok: [managed-node01] => {"changed": false, "checksum": "97d170e1550eee4afc0af065b78cda302a97674c", "dest": "/etc/ansible/facts.d/non_rhel_packages.fact", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "path": "/etc/ansible/facts.d/non_rhel_packages.fact", "secontext": "system_u:object_r:etc_t:s0", "size": 2, "state": "file", "uid": 0}

TASK [infra.leapp.analysis : Include tasks for preupg assistant analysis] ******
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/main.yml:9
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.analysis : Include tasks for leapp preupgrade analysis] ******
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/main.yml:13
included: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml for managed-node01

TASK [infra.leapp.analysis : analysis-leapp | Register to leapp activation key] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml:2
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [analysis-leapp | Include custom_local_repos for local_repos_pre_leapp] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml:14
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [infra.leapp.analysis : analysis-leapp | Install packages for preupgrade analysis on RHEL 7] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml:22
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.analysis : analysis-leapp | Install packages for preupgrade analysis on RHEL 8] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml:29
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.analysis : analysis-leapp | Install packages for preupgrade analysis on RHEL 9] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml:36
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.analysis : analysis-leapp | Ensure leapp log directory exists] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml:43
ok: [managed-node01] => {"changed": false, "gid": 0, "group": "root", "mode": "0700", "owner": "root", "path": "/var/log/leapp", "secontext": "system_u:object_r:var_log_t:s0", "size": 145, "state": "directory", "uid": 0}

TASK [infra.leapp.analysis : analysis-leapp | Populate leapp_answers file] *****
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml:51
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [analysis-leapp | Create /etc/leapp/files/leapp_upgrade_repositories.repo] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml:60
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.analysis : analysis-leapp | Leapp preupgrade report] *********
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml:71
ASYNC FAILED on managed-node01: jid=j712132648771.10260
fatal: [managed-node01]: FAILED! => {"ansible_job_id": "j712132648771.10260", "changed": true, "cmd": "set -o pipefail; export PATH=$PATH; ulimit -n 16384; leapp preupgrade --report-schema=1.2.0 2>&1 | tee -a /var/log/ripu/ripu.log\n", "delta": "0:00:00.004849", "end": "2025-11-26 16:31:33.188355", "failed_when_result": true, "finished": 1, "msg": "non-zero return code", "rc": 127, "results_file": "/root/.ansible_async/j712132648771.10260", "start": "2025-11-26 16:31:33.183506", "started": 1, "stderr": "", "stderr_lines": [], "stdout": "/bin/bash: line 1: leapp: command not found", "stdout_lines": ["/bin/bash: line 1: leapp: command not found"]}

PLAY RECAP *********************************************************************
managed-node01             : ok=11   changed=2    unreachable=0    failed=1    skipped=9    rescued=0    ignored=0

Nov 26 16:31:27 managed-node01 python3[8747]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 26 16:31:28 managed-node01 python3[8909]: ansible-ansible.builtin.file Invoked with path=/var/log/ripu state=directory owner=root group=root mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 16:31:28 managed-node01 python3[9040]: ansible-ansible.builtin.stat Invoked with path=/var/log/ripu/ripu.log follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 26 16:31:28 managed-node01 python3[9171]: ansible-ansible.legacy.stat Invoked with path=/var/log/ripu/ripu.log follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 26 16:31:29 managed-node01 python3[9276]: ansible-ansible.legacy.copy Invoked with dest=/var/log/ripu/ripu.log owner=root group=root mode=0644 src=/root/.ansible/tmp/ansible-tmp-1764192688.6229005-12254-226189522363991/source _original_basename=tmpj7jc8guz follow=False checksum=673af3d02ce91d427364dc0ada5e7ffcc35a9dd5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 16:31:29 managed-node01 python3[9407]: ansible-ansible.builtin.file Invoked with path=/etc/ansible/facts.d state=directory mode=0755 owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 16:31:30 managed-node01 python3[9538]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/pre_ripu.fact follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 26 16:31:30 managed-node01 python3[9645]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/pre_ripu.fact mode=0644 owner=root group=root src=/root/.ansible/tmp/ansible-tmp-1764192689.8072124-12282-12841330646624/source _original_basename=tmp2wsos940 follow=False checksum=7844e973c7e4971475e20576ca75f492c991c406 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 16:31:30 managed-node01 python3[9776]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; export PATH=$PATH; rpm -qa | grep -ve '[\.|+]el10' | grep -vE '^(gpg-pubkey|libmodulemd|katello-ca-consumer)' | sort _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 26 16:31:31 managed-node01 python3[9912]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/non_rhel_packages.fact follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 26 16:31:31 managed-node01 python3[9978]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/etc/ansible/facts.d/non_rhel_packages.fact _original_basename=tmpbd0irp5m recurse=False state=file path=/etc/ansible/facts.d/non_rhel_packages.fact force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 16:31:32 managed-node01 python3[10109]: ansible-ansible.builtin.file Invoked with path=/var/log/leapp state=directory owner=root group=root mode=0700 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 26 16:31:32 managed-node01 ansible-async_wrapper.py[10260]: Invoked with j712132648771 7200 /root/.ansible/tmp/ansible-tmp-1764192692.5873675-12358-93824483780108/AnsiballZ_command.py _
Nov 26 16:31:33 managed-node01 ansible-async_wrapper.py[10263]: Starting module and watcher
Nov 26 16:31:33 managed-node01 ansible-async_wrapper.py[10263]: Start watching 10264 (7200)
Nov 26 16:31:33 managed-node01 ansible-async_wrapper.py[10264]: Start module (10264)
Nov 26 16:31:33 managed-node01 ansible-async_wrapper.py[10260]: Return async_wrapper task started.
Nov 26 16:31:33 managed-node01 python3[10265]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=set -o pipefail; export PATH=$PATH; ulimit -n 16384; leapp preupgrade --report-schema=1.2.0 2>&1 | tee -a /var/log/ripu/ripu.log _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Nov 26 16:31:33 managed-node01 ansible-async_wrapper.py[10264]: Module complete (10264)
Nov 26 16:31:38 managed-node01 ansible-async_wrapper.py[10263]: Done in kid B.
Nov 26 16:32:32 managed-node01 sshd-session[8174]: Received disconnect from 10.31.42.200 port 56200:11: disconnected by user
Nov 26 16:32:32 managed-node01 sshd-session[8174]: Disconnected from user root 10.31.42.200 port 56200
Nov 26 16:32:32 managed-node01 sshd-session[8171]: pam_unix(sshd:session): session closed for user root
Nov 26 16:32:32 managed-node01 systemd[1]: session-3.scope: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ The unit session-3.scope has successfully entered the 'dead' state.
Nov 26 16:32:32 managed-node01 systemd[1]: session-3.scope: Consumed 5.407s CPU time, 47.4M memory peak.
░░ Subject: Resources consumed by unit runtime
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ The unit session-3.scope completed and consumed the indicated resources.
Nov 26 16:32:32 managed-node01 systemd-logind[677]: Session 3 logged out. Waiting for processes to exit.
Nov 26 16:32:32 managed-node01 systemd-logind[677]: Removed session 3.
░░ Subject: Session 3 has been terminated
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░ 
░░ A session with the ID 3 has been terminated.
Nov 26 16:32:33 managed-node01 sshd-session[10270]: Accepted publickey for root from 10.31.42.200 port 35210 ssh2: ECDSA SHA256:pyLt3QlY0Ji0Q/eNKgQYyk34PQetRaFnjqq3Hq5pH2s
Nov 26 16:32:33 managed-node01 systemd-logind[677]: New session 6 of user root.
░░ Subject: A new session 6 has been created for user root
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░ 
░░ A new session with the ID 6 has been created for the user root.
░░ 
░░ The leading process of the session is 10270.
Nov 26 16:32:33 managed-node01 systemd[1]: Started session-6.scope - Session 6 of User root.
░░ Subject: A start job for unit session-6.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit session-6.scope has finished successfully.
░░ 
░░ The job identifier is 1419.
Nov 26 16:32:33 managed-node01 sshd-session[10270]: pam_unix(sshd:session): session opened for user root(uid=0) by root(uid=0)
Nov 26 16:32:33 managed-node01 python3[10403]: ansible-ansible.legacy.async_status Invoked with jid=j712132648771.10260 mode=status _async_dir=/root/.ansible_async
Nov 26 16:32:33 managed-node01 python3[10487]: ansible-ansible.legacy.async_status Invoked with jid=j712132648771.10260 mode=cleanup _async_dir=/root/.ansible_async
Nov 26 16:32:34 managed-node01 sshd-session[10509]: Accepted publickey for root from 10.31.42.200 port 35226 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Nov 26 16:32:34 managed-node01 systemd-logind[677]: New session 7 of user root.
░░ Subject: A new session 7 has been created for user root
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░ 
░░ A new session with the ID 7 has been created for the user root.
░░ 
░░ The leading process of the session is 10509.
Nov 26 16:32:34 managed-node01 systemd[1]: Started session-7.scope - Session 7 of User root.
░░ Subject: A start job for unit session-7.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit session-7.scope has finished successfully.
░░ 
░░ The job identifier is 1526.
Nov 26 16:32:34 managed-node01 sshd-session[10509]: pam_unix(sshd:session): session opened for user root(uid=0) by root(uid=0)
Nov 26 16:32:34 managed-node01 sshd-session[10512]: Received disconnect from 10.31.42.200 port 35226:11: disconnected by user
Nov 26 16:32:34 managed-node01 sshd-session[10512]: Disconnected from user root 10.31.42.200 port 35226
Nov 26 16:32:34 managed-node01 sshd-session[10509]: pam_unix(sshd:session): session closed for user root
Nov 26 16:32:34 managed-node01 systemd-logind[677]: Session 7 logged out. Waiting for processes to exit.
Nov 26 16:32:34 managed-node01 systemd[1]: session-7.scope: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ The unit session-7.scope has successfully entered the 'dead' state.
Nov 26 16:32:34 managed-node01 systemd-logind[677]: Removed session 7.
░░ Subject: Session 7 has been terminated
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░ 
░░ A session with the ID 7 has been terminated.
Nov 26 16:32:34 managed-node01 sshd-session[10535]: Accepted publickey for root from 10.31.42.200 port 35234 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Nov 26 16:32:34 managed-node01 systemd-logind[677]: New session 8 of user root.
░░ Subject: A new session 8 has been created for user root
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░ 
░░ A new session with the ID 8 has been created for the user root.
░░ 
░░ The leading process of the session is 10535.
Nov 26 16:32:34 managed-node01 systemd[1]: Started session-8.scope - Session 8 of User root.
░░ Subject: A start job for unit session-8.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit session-8.scope has finished successfully.
░░ 
░░ The job identifier is 1633.
Nov 26 16:32:34 managed-node01 sshd-session[10535]: pam_unix(sshd:session): session opened for user root(uid=0) by root(uid=0)