[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
11661 1726882370.65645: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-Xyq
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
11661 1726882370.66125: Added group all to inventory
11661 1726882370.66128: Added group ungrouped to inventory
11661 1726882370.66132: Group all now contains ungrouped
11661 1726882370.66135: Examining possible inventory source: /tmp/network-91m/inventory.yml
11661 1726882371.00289: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
11661 1726882371.00348: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
11661 1726882371.00375: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
11661 1726882371.00439: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
11661 1726882371.00521: Loaded config def from plugin (inventory/script)
11661 1726882371.00523: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
11661 1726882371.00568: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
11661 1726882371.00667: Loaded config def from plugin (inventory/yaml)
11661 1726882371.00670: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
11661 1726882371.00761: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
11661 1726882371.01182: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
11661 1726882371.01186: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
11661 1726882371.01189: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
11661 1726882371.01194: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
11661 1726882371.01199: Loading data from /tmp/network-91m/inventory.yml
11661 1726882371.01269: /tmp/network-91m/inventory.yml was not parsable by auto
11661 1726882371.01335: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
11661 1726882371.01381: Loading data from /tmp/network-91m/inventory.yml
11661 1726882371.01457: group all already in inventory
11661 1726882371.01469: set inventory_file for managed_node1
11661 1726882371.01473: set inventory_dir for managed_node1
11661 1726882371.01475: Added host managed_node1 to inventory
11661 1726882371.01477: Added host managed_node1 to group all
11661 1726882371.01478: set ansible_host for managed_node1
11661 1726882371.01479: set ansible_ssh_extra_args for managed_node1
11661 1726882371.01483: set inventory_file for managed_node2
11661 1726882371.01486: set inventory_dir for managed_node2
11661 1726882371.01487: Added host managed_node2 to inventory
11661 1726882371.01489: Added host managed_node2 to group all
11661 1726882371.01490: set ansible_host for managed_node2
11661 1726882371.01491: set ansible_ssh_extra_args for managed_node2
11661 1726882371.01493: set inventory_file for managed_node3
11661 1726882371.01496: set inventory_dir for managed_node3
11661 1726882371.01497: Added host managed_node3 to inventory
11661 1726882371.01498: Added host managed_node3 to group all
11661 1726882371.01499: set ansible_host for managed_node3
11661 1726882371.01500: set ansible_ssh_extra_args for managed_node3
11661 1726882371.01503: Reconcile groups and hosts in inventory.
11661 1726882371.01507: Group ungrouped now contains managed_node1
11661 1726882371.01509: Group ungrouped now contains managed_node2
11661 1726882371.01511: Group ungrouped now contains managed_node3
11661 1726882371.01723: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name
11661 1726882371.01975: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments
11661 1726882371.02141: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py
11661 1726882371.02176: Loaded config def from plugin (vars/host_group_vars)
11661 1726882371.02178: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True)
11661 1726882371.02185: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars
11661 1726882371.02194: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
11661 1726882371.02282: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False)
11661 1726882371.03037: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11661 1726882371.03148: Loading ModuleDocFragment 'connection_pipelining' from
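The inventory parse above adds three ungrouped hosts and sets ansible_host and ansible_ssh_extra_args host vars for each. A /tmp/network-91m/inventory.yml consistent with those entries might look like the following sketch; the addresses and SSH arguments are placeholders, since the log records only the variable names, never their values:

```yaml
# Hypothetical reconstruction -- only the host names and var keys are
# confirmed by the log; every value below is a placeholder.
all:
  hosts:
    managed_node1:
      ansible_host: 203.0.113.11                              # placeholder
      ansible_ssh_extra_args: -o StrictHostKeyChecking=no     # placeholder
    managed_node2:
      ansible_host: 203.0.113.12                              # placeholder
      ansible_ssh_extra_args: -o StrictHostKeyChecking=no     # placeholder
    managed_node3:
      ansible_host: 203.0.113.13                              # placeholder
      ansible_ssh_extra_args: -o StrictHostKeyChecking=no     # placeholder
```

Hosts declared under `all.hosts` with no explicit group land in the implicit `ungrouped` group, which matches the "Group ungrouped now contains managed_node1/2/3" reconcile messages.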
/usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 11661 1726882371.03236: Loaded config def from plugin (connection/local) 11661 1726882371.03239: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 11661 1726882371.03901: Loaded config def from plugin (connection/paramiko_ssh) 11661 1726882371.03904: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 11661 1726882371.04861: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 11661 1726882371.04903: Loaded config def from plugin (connection/psrp) 11661 1726882371.04906: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 11661 1726882371.05727: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 11661 1726882371.05772: Loaded config def from plugin (connection/ssh) 11661 1726882371.05775: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 11661 1726882371.08162: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 11661 1726882371.08215: Loaded config def from plugin (connection/winrm) 11661 1726882371.08218: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 11661 1726882371.08252: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 11661 1726882371.08327: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 11661 1726882371.08399: Loaded config def from plugin (shell/cmd) 11661 1726882371.08401: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 11661 1726882371.08431: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 11661 1726882371.08502: Loaded config def from plugin (shell/powershell) 11661 1726882371.08504: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 11661 1726882371.08567: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 11661 1726882371.08798: Loaded config def from plugin (shell/sh) 11661 1726882371.08800: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 11661 1726882371.08840: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 11661 1726882371.09171: Loaded config def from plugin (become/runas) 11661 1726882371.09174: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 11661 1726882371.09384: Loaded config def from plugin (become/su) 11661 1726882371.09387: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 11661 1726882371.09570: Loaded config def from plugin (become/sudo) 11661 
1726882371.09572: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 11661 1726882371.09606: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_nm.yml 11661 1726882371.09976: in VariableManager get_vars() 11661 1726882371.09999: done with get_vars() 11661 1726882371.10135: trying /usr/local/lib/python3.12/site-packages/ansible/modules 11661 1726882371.14101: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 11661 1726882371.14217: in VariableManager get_vars() 11661 1726882371.14222: done with get_vars() 11661 1726882371.14224: variable 'playbook_dir' from source: magic vars 11661 1726882371.14225: variable 'ansible_playbook_python' from source: magic vars 11661 1726882371.14226: variable 'ansible_config_file' from source: magic vars 11661 1726882371.14227: variable 'groups' from source: magic vars 11661 1726882371.14227: variable 'omit' from source: magic vars 11661 1726882371.14228: variable 'ansible_version' from source: magic vars 11661 1726882371.14229: variable 'ansible_check_mode' from source: magic vars 11661 1726882371.14229: variable 'ansible_diff_mode' from source: magic vars 11661 1726882371.14230: variable 'ansible_forks' from source: magic vars 11661 1726882371.14231: variable 'ansible_inventory_sources' from source: magic vars 11661 1726882371.14231: variable 'ansible_skip_tags' from source: magic vars 11661 1726882371.14232: variable 'ansible_limit' from source: magic vars 11661 1726882371.14233: variable 'ansible_run_tags' from source: magic vars 11661 1726882371.14233: variable 'ansible_verbosity' from source: magic vars 11661 1726882371.14274: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml 11661 1726882371.15043: in VariableManager get_vars() 
11661 1726882371.15065: done with get_vars() 11661 1726882371.15080: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml statically imported: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml 11661 1726882371.16129: in VariableManager get_vars() 11661 1726882371.16144: done with get_vars() 11661 1726882371.16157: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 11661 1726882371.16272: in VariableManager get_vars() 11661 1726882371.16293: done with get_vars() 11661 1726882371.16445: in VariableManager get_vars() 11661 1726882371.16462: done with get_vars() 11661 1726882371.16473: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 11661 1726882371.16547: in VariableManager get_vars() 11661 1726882371.16565: done with get_vars() 11661 1726882371.16846: in VariableManager get_vars() 11661 1726882371.16862: done with get_vars() 11661 1726882371.16867: variable 'omit' from source: magic vars 11661 1726882371.16884: variable 'omit' from source: magic vars 11661 1726882371.16915: in VariableManager get_vars() 11661 1726882371.16929: done with get_vars() 11661 1726882371.16978: in VariableManager get_vars() 11661 1726882371.16991: done with get_vars() 11661 1726882371.17024: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 11661 1726882371.17276: Loading data from 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 11661 1726882371.17399: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 11661 1726882371.18098: in VariableManager get_vars() 11661 1726882371.18121: done with get_vars() 11661 1726882371.18562: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ 11661 1726882371.18712: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 11661 1726882371.20384: in VariableManager get_vars() 11661 1726882371.20408: done with get_vars() 11661 1726882371.20418: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 11661 1726882371.20517: in VariableManager get_vars() 11661 1726882371.20535: done with get_vars() 11661 1726882371.20668: in VariableManager get_vars() 11661 1726882371.20685: done with get_vars() 11661 1726882371.20992: in VariableManager get_vars() 11661 1726882371.21011: done with get_vars() 11661 1726882371.21016: variable 'omit' from source: magic vars 11661 1726882371.21041: variable 'omit' from source: magic vars 11661 1726882371.21092: in VariableManager get_vars() 11661 1726882371.21106: done with get_vars() 11661 1726882371.21127: in VariableManager get_vars() 11661 1726882371.21141: done with get_vars() 11661 1726882371.21179: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 11661 1726882371.21312: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 11661 1726882371.21398: Loading data from 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 11661 1726882371.21894: in VariableManager get_vars() 11661 1726882371.21920: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 11661 1726882371.23996: in VariableManager get_vars() 11661 1726882371.24015: done with get_vars() 11661 1726882371.24024: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml statically imported: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml 11661 1726882371.26320: in VariableManager get_vars() 11661 1726882371.26342: done with get_vars() 11661 1726882371.26410: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 11661 1726882371.26425: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 11661 1726882371.26671: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 11661 1726882371.26846: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 11661 1726882371.26852: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-Xyq/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) 11661 1726882371.26885: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 11661 1726882371.26915: Loading ModuleDocFragment 'default_callback' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 11661 1726882371.27097: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 11661 1726882371.27167: Loaded config def from plugin (callback/default) 11661 1726882371.27170: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 11661 1726882371.28387: Loaded config def from plugin (callback/junit) 11661 1726882371.28390: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 11661 1726882371.28438: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 11661 1726882371.28508: Loaded config def from plugin (callback/minimal) 11661 1726882371.28511: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 11661 1726882371.28553: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 11661 1726882371.28608: Loaded config def from plugin (callback/tree) 11661 1726882371.28611: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks 11661 1726882371.28733: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks) 11661 1726882371.28736: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-Xyq/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) Skipping callback 'default', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. 
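This run found no config file ("config file = None"), so the stdout callback, the extra enabled callbacks, and the collection path were presumably supplied via environment variables or CLI flags. An ansible.cfg that would reproduce the same plugin setup might look like this sketch (nothing here beyond the collection path and the callback names is confirmed by the log):

```ini
; Hypothetical ansible.cfg -- this run used defaults, not a config file.
[defaults]
; Singular form; ANSIBLE_COLLECTIONS_PATHS is deprecated and scheduled for
; removal in ansible-core 2.19, per the warning at the top of this log.
collections_path = /tmp/collections-Xyq
; The "Skipping callback 'default' ..." lines show a custom stdout callback
; was already in place.
stdout_callback = ansible.posix.debug
; Callbacks seen loading in this run; profile_tasks produces the per-task
; timing lines such as "(0:00:00.011)  0:00:00.011".
callbacks_enabled = ansible.posix.profile_tasks, junit
```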
PLAYBOOK: tests_bond_nm.yml ****************************************************
2 plays in /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_nm.yml
11661 1726882371.28772: in VariableManager get_vars()
11661 1726882371.28785: done with get_vars()
11661 1726882371.28790: in VariableManager get_vars()
11661 1726882371.28799: done with get_vars()
11661 1726882371.28802: variable 'omit' from source: magic vars
11661 1726882371.28837: in VariableManager get_vars()
11661 1726882371.28857: done with get_vars()
11661 1726882371.28880: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_bond.yml' with nm as provider] *************
11661 1726882371.29461: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
11661 1726882371.29540: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
11661 1726882371.29577: getting the remaining hosts for this loop
11661 1726882371.29579: done getting the remaining hosts for this loop
11661 1726882371.29581: getting the next task for host managed_node2
11661 1726882371.29585: done getting next task for host managed_node2
11661 1726882371.29587: ^ task is: TASK: Gathering Facts
11661 1726882371.29588: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11661 1726882371.29591: getting variables
11661 1726882371.29592: in VariableManager get_vars()
11661 1726882371.29600: Calling all_inventory to load vars for managed_node2
11661 1726882371.29603: Calling groups_inventory to load vars for managed_node2
11661 1726882371.29605: Calling all_plugins_inventory to load vars for managed_node2
11661 1726882371.29619: Calling all_plugins_play to load vars for managed_node2
11661 1726882371.29630: Calling groups_plugins_inventory to load vars for managed_node2
11661 1726882371.29634: Calling groups_plugins_play to load vars for managed_node2
11661 1726882371.29673: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11661 1726882371.29725: done with get_vars()
11661 1726882371.29731: done getting variables
11661 1726882371.29796: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_nm.yml:6
Friday 20 September 2024 21:32:51 -0400 (0:00:00.011) 0:00:00.011 ******
11661 1726882371.29817: entering _queue_task() for managed_node2/gather_facts
11661 1726882371.29818: Creating lock for gather_facts
11661 1726882371.30166: worker is 1 (out of 1 available)
11661 1726882371.30182: exiting _queue_task() for managed_node2/gather_facts
11661 1726882371.30197: done queuing things up, now waiting for results queue to drain
11661 1726882371.30203: waiting for pending results...
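From here the worker opens the SSH connection and bootstraps a remote working directory, which shows up later in this log as _low_level_execute_command() running `echo ~` and then `( umask 77 && mkdir -p ... && mkdir ... )`. A minimal local re-creation of that mkdir pattern, with illustrative paths and suffix rather than the values from this run:

```python
import os
import tempfile
import time

# Re-creation of the remote tmpdir bootstrap Ansible performs before copying
# a module over. Base path and suffix below are illustrative, not this run's.
old_umask = os.umask(0o077)        # matches the `umask 77` in the log's command
base = os.path.join(tempfile.gettempdir(), "ansible-demo-%d" % os.getpid())
stamp = "ansible-tmp-%d-%d-0001" % (time.time(), os.getpid())
os.makedirs(base, exist_ok=True)   # `mkdir -p` equivalent
os.mkdir(os.path.join(base, stamp))  # fails loudly if the dir already exists
os.umask(old_umask)
print("%s=%s" % (stamp, os.path.join(base, stamp)))
```

The `echo ansible-tmp-...=` line in the real command serves the same purpose as the final print here: it reports the created path back to the controller on stdout.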
11661 1726882371.30698: running TaskExecutor() for managed_node2/TASK: Gathering Facts 11661 1726882371.30810: in run() - task 0e448fcc-3ce9-896b-2321-0000000000cc 11661 1726882371.30838: variable 'ansible_search_path' from source: unknown 11661 1726882371.30883: calling self._execute() 11661 1726882371.30956: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882371.30971: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882371.30984: variable 'omit' from source: magic vars 11661 1726882371.31093: variable 'omit' from source: magic vars 11661 1726882371.31126: variable 'omit' from source: magic vars 11661 1726882371.31175: variable 'omit' from source: magic vars 11661 1726882371.31226: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11661 1726882371.31280: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11661 1726882371.31305: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11661 1726882371.31327: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882371.31343: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882371.31387: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11661 1726882371.31395: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882371.31403: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882371.31514: Set connection var ansible_connection to ssh 11661 1726882371.31525: Set connection var ansible_pipelining to False 11661 1726882371.31536: Set connection var ansible_module_compression to ZIP_DEFLATED 11661 1726882371.31550: Set connection var ansible_timeout to 10 
11661 1726882371.31559: Set connection var ansible_shell_type to sh 11661 1726882371.31574: Set connection var ansible_shell_executable to /bin/sh 11661 1726882371.31606: variable 'ansible_shell_executable' from source: unknown 11661 1726882371.31613: variable 'ansible_connection' from source: unknown 11661 1726882371.31620: variable 'ansible_module_compression' from source: unknown 11661 1726882371.31627: variable 'ansible_shell_type' from source: unknown 11661 1726882371.31633: variable 'ansible_shell_executable' from source: unknown 11661 1726882371.31639: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882371.31646: variable 'ansible_pipelining' from source: unknown 11661 1726882371.31657: variable 'ansible_timeout' from source: unknown 11661 1726882371.31668: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882371.31891: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11661 1726882371.31907: variable 'omit' from source: magic vars 11661 1726882371.31918: starting attempt loop 11661 1726882371.31930: running the handler 11661 1726882371.31952: variable 'ansible_facts' from source: unknown 11661 1726882371.31978: _low_level_execute_command(): starting 11661 1726882371.31991: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11661 1726882371.32789: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882371.32810: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882371.32826: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882371.32845: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882371.32893: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882371.32912: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882371.32928: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882371.32948: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882371.32967: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882371.32979: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882371.32990: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882371.33002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882371.33020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882371.33031: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882371.33040: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882371.33054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882371.33134: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882371.33159: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882371.33176: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882371.33307: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882371.34972: stdout chunk (state=3): >>>/root <<< 11661 1726882371.35151: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882371.35155: stdout chunk 
(state=3): >>><<< 11661 1726882371.35157: stderr chunk (state=3): >>><<< 11661 1726882371.35277: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882371.35280: _low_level_execute_command(): starting 11661 1726882371.35283: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882371.3518252-11704-145882618393862 `" && echo ansible-tmp-1726882371.3518252-11704-145882618393862="` echo /root/.ansible/tmp/ansible-tmp-1726882371.3518252-11704-145882618393862 `" ) && sleep 0' 11661 1726882371.35903: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882371.35931: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882371.35951: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882371.35973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882371.36015: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882371.36028: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882371.36055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882371.36076: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882371.36089: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882371.36100: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882371.36113: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882371.36126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882371.36152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882371.36169: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882371.36183: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882371.36197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882371.36284: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882371.36307: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882371.36324: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882371.36458: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882371.38317: stdout chunk (state=3): 
>>>ansible-tmp-1726882371.3518252-11704-145882618393862=/root/.ansible/tmp/ansible-tmp-1726882371.3518252-11704-145882618393862 <<< 11661 1726882371.38495: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882371.38498: stderr chunk (state=3): >>><<< 11661 1726882371.38501: stdout chunk (state=3): >>><<< 11661 1726882371.38769: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882371.3518252-11704-145882618393862=/root/.ansible/tmp/ansible-tmp-1726882371.3518252-11704-145882618393862 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882371.38772: variable 'ansible_module_compression' from source: unknown 11661 1726882371.38774: ANSIBALLZ: Using generic lock for ansible.legacy.setup 11661 1726882371.38777: ANSIBALLZ: Acquiring lock 11661 1726882371.38778: ANSIBALLZ: Lock acquired: 139652576276224 11661 1726882371.38780: ANSIBALLZ: 
Creating module 11661 1726882371.71098: ANSIBALLZ: Writing module into payload 11661 1726882371.71300: ANSIBALLZ: Writing module 11661 1726882371.71337: ANSIBALLZ: Renaming module 11661 1726882371.71351: ANSIBALLZ: Done creating module 11661 1726882371.71397: variable 'ansible_facts' from source: unknown 11661 1726882371.71416: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11661 1726882371.71431: _low_level_execute_command(): starting 11661 1726882371.71441: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 11661 1726882371.72200: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882371.72216: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882371.72232: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882371.72256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882371.72308: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882371.72321: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882371.72336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882371.72357: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882371.72376: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882371.72395: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882371.72409: stderr 
chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882371.72422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882371.72438: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882371.72452: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882371.72468: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882371.72488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882371.72577: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882371.72612: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882371.72631: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882371.72776: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882371.74437: stdout chunk (state=3): >>>PLATFORM <<< 11661 1726882371.74553: stdout chunk (state=3): >>>Linux <<< 11661 1726882371.74558: stdout chunk (state=3): >>>FOUND <<< 11661 1726882371.74561: stdout chunk (state=3): >>>/usr/bin/python3.9 <<< 11661 1726882371.74563: stdout chunk (state=3): >>>/usr/bin/python3 /usr/bin/python3 <<< 11661 1726882371.74570: stdout chunk (state=3): >>>ENDFOUND <<< 11661 1726882371.74731: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882371.74787: stderr chunk (state=3): >>><<< 11661 1726882371.74802: stdout chunk (state=3): >>><<< 11661 1726882371.74992: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.9 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882371.75003 [managed_node2]: found interpreters: ['/usr/bin/python3.9', '/usr/bin/python3', '/usr/bin/python3'] 11661 1726882371.75006: _low_level_execute_command(): starting 11661 1726882371.75008: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 && sleep 0' 11661 1726882371.75076: Sending initial data 11661 1726882371.75080: Sent initial data (1181 bytes) 11661 1726882371.75674: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882371.75689: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882371.75704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882371.75722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882371.75780: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882371.75793: stderr chunk 
(state=3): >>>debug2: match not found <<< 11661 1726882371.75807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882371.75825: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882371.75838: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882371.75854: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882371.75877: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882371.75893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882371.75910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882371.75922: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882371.75933: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882371.75946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882371.76034: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882371.76060: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882371.76084: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882371.76222: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882371.79969: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"9\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"9\"\nPLATFORM_ID=\"platform:el9\"\nPRETTY_NAME=\"CentOS Stream 
9\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:9\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 9\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 11661 1726882371.80570: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882371.80574: stdout chunk (state=3): >>><<< 11661 1726882371.80576: stderr chunk (state=3): >>><<< 11661 1726882371.80579: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"9\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"9\"\nPLATFORM_ID=\"platform:el9\"\nPRETTY_NAME=\"CentOS Stream 9\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:9\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 9\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882371.80581: variable 'ansible_facts' from source: unknown 11661 1726882371.80583: variable 'ansible_facts' from source: unknown 11661 1726882371.80585: variable 'ansible_module_compression' from source: unknown 11661 1726882371.80703: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11661k_nrvl51/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 11661 1726882371.80706: variable 'ansible_facts' from source: unknown 11661 1726882371.80800: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882371.3518252-11704-145882618393862/AnsiballZ_setup.py 11661 1726882371.81452: Sending initial data 11661 1726882371.81456: Sent initial data (154 bytes) 11661 1726882371.85002: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882371.85016: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882371.85030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882371.85047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882371.85102: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882371.85115: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882371.85130: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882371.85148: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882371.85164: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882371.85182: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 
1726882371.85194: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882371.85206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882371.85297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882371.85311: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882371.85323: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882371.85337: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882371.85496: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882371.85523: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882371.85536: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882371.85670: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882371.87490: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11661 1726882371.87586: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 11661 1726882371.87690: stdout chunk (state=3): >>>sftp> put 
/root/.ansible/tmp/ansible-local-11661k_nrvl51/tmplxx57mqz /root/.ansible/tmp/ansible-tmp-1726882371.3518252-11704-145882618393862/AnsiballZ_setup.py <<< 11661 1726882371.87788: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11661 1726882371.90651: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882371.90739: stderr chunk (state=3): >>><<< 11661 1726882371.90742: stdout chunk (state=3): >>><<< 11661 1726882371.90767: done transferring module to remote 11661 1726882371.90783: _low_level_execute_command(): starting 11661 1726882371.90786: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882371.3518252-11704-145882618393862/ /root/.ansible/tmp/ansible-tmp-1726882371.3518252-11704-145882618393862/AnsiballZ_setup.py && sleep 0' 11661 1726882371.91496: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882371.91500: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882371.91511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882371.91613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882371.91616: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882371.91620: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882371.91623: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882371.91625: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882371.91627: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882371.91629: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882371.91631: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882371.91633: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882371.91665: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882371.91668: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882371.91671: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882371.91677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882371.91758: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882371.91776: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882371.91788: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882371.91911: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882371.93764: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882371.93768: stdout chunk (state=3): >>><<< 11661 1726882371.93775: stderr chunk (state=3): >>><<< 11661 1726882371.93793: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882371.93798: _low_level_execute_command(): starting 11661 1726882371.93801: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882371.3518252-11704-145882618393862/AnsiballZ_setup.py && sleep 0' 11661 1726882371.94417: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882371.94425: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882371.94441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882371.94454: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882371.94494: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882371.94504: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882371.94529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882371.94533: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882371.94543: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882371.94553: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882371.94559: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 
1726882371.94574: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882371.94582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882371.94590: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882371.94598: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882371.94714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882371.94976: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882371.94980: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882371.94983: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882371.94985: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882371.96882: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin <<< 11661 1726882371.96888: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 11661 1726882371.96957: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 11661 1726882371.96983: stdout chunk (state=3): >>>import 'posix' # <<< 11661 1726882371.97010: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 11661 1726882371.97061: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 11661 1726882371.97109: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 11661 1726882371.97154: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches 
/usr/lib64/python3.9/codecs.py <<< 11661 1726882371.97166: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # <<< 11661 1726882371.97184: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e13f3dc0> <<< 11661 1726882371.97227: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<< 11661 1726882371.97255: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e13983a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e13f3b20> <<< 11661 1726882371.97292: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' <<< 11661 1726882371.97295: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e13f3ac0> <<< 11661 1726882371.97330: stdout chunk (state=3): >>>import '_signal' # <<< 11661 1726882371.97354: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' <<< 11661 1726882371.97378: stdout chunk (state=3): >>>import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e1398490> <<< 11661 1726882371.97405: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc 
matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' <<< 11661 1726882371.97427: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e1398940> <<< 11661 1726882371.97442: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e1398670> <<< 11661 1726882371.97483: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py <<< 11661 1726882371.97487: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 11661 1726882371.97498: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<< 11661 1726882371.97542: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<< 11661 1726882371.97567: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 11661 1726882371.97588: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e134f190> <<< 11661 1726882371.97612: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 11661 1726882371.97623: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 11661 1726882371.98006: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e134f220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # 
/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e1372850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e134f940> import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e13b0880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e1348d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e1372d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e1398970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 11661 1726882371.98308: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py <<< 11661 1726882371.98318: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 11661 1726882371.98369: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 11661 1726882371.98372: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 11661 1726882371.98409: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py <<< 11661 1726882371.98425: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' <<< 11661 1726882371.98428: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e12eeeb0> <<< 11661 1726882371.98471: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e12f0f40> <<< 11661 1726882371.98507: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 11661 1726882371.98519: stdout chunk (state=3): >>>import '_sre' # <<< 11661 1726882371.98534: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py <<< 11661 1726882371.98562: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<< 11661 1726882371.98591: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 11661 1726882371.98624: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e12e7610> <<< 11661 1726882371.98632: stdout chunk (state=3): >>>import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e12ed640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e12ee370> <<< 11661 1726882371.98644: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 11661 1726882371.98720: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 11661 1726882371.98733: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 11661 1726882371.98777: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 11661 1726882371.98825: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 11661 1726882371.98875: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f32e0f91dc0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0f918b0> <<< 11661 1726882371.98887: stdout chunk (state=3): >>>import 'itertools' # # 
/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0f91eb0> <<< 11661 1726882371.98923: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 11661 1726882371.98939: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0f91f70> <<< 11661 1726882371.98986: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0f91e80> import '_collections' # <<< 11661 1726882371.99023: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e12c9d30> import '_functools' # <<< 11661 1726882371.99130: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e12c2610> <<< 11661 1726882371.99166: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e12d6670> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e12f5e20> <<< 11661 1726882371.99214: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 11661 1726882371.99244: stdout chunk 
(state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f32e0fa3c70> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e12c9250> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so'<<< 11661 1726882371.99296: stdout chunk (state=3): >>> import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f32e12d6280> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e12fb9d0> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py <<< 11661 1726882371.99346: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' <<< 11661 1726882371.99377: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0fa3fa0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0fa3d90> <<< 11661 1726882372.00078: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from 
'/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0fa3d00> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' <<< 11661 1726882372.00111: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0f76370> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0f76460> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0fabfa0> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0fa5a30> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0fa5490> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # 
/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0ec41c0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0f61c70> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0fa5eb0> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e12fb040> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0ed6af0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f32e0ed6e20> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' <<< 11661 1726882372.00120: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' <<< 11661 1726882372.00137: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0ee8730> <<< 11661 1726882372.00158: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches 
/usr/lib64/python3.9/threading.py <<< 11661 1726882372.00176: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 11661 1726882372.00205: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0ee8c70> <<< 11661 1726882372.00250: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f32e0e813a0> <<< 11661 1726882372.00255: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0ed6f10> <<< 11661 1726882372.00276: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py <<< 11661 1726882372.00279: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 11661 1726882372.00332: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f32e0e91280> <<< 11661 1726882372.00340: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0ee85b0> <<< 11661 1726882372.00343: stdout chunk (state=3): >>>import 'pwd' # <<< 11661 1726882372.00360: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f32e0e91340> <<< 11661 1726882372.00416: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0fa39d0> <<< 11661 1726882372.00442: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py <<< 11661 1726882372.00445: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' <<< 11661 1726882372.00465: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py <<< 11661 1726882372.00476: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 11661 1726882372.00501: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' <<< 11661 1726882372.00540: stdout chunk (state=3): >>># extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f32e0ead6a0> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py <<< 11661 1726882372.00544: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' <<< 11661 1726882372.00561: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f32e0ead970> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0ead760> <<< 11661 1726882372.00595: stdout chunk (state=3): >>># extension module '_random' loaded from 
'/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f32e0ead850> <<< 11661 1726882372.00623: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 11661 1726882372.00820: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f32e0eadca0> <<< 11661 1726882372.00925: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f32e0eb91f0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0ead8e0> <<< 11661 1726882372.00927: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0ea0a30> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0fa35b0> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<< 11661 1726882372.00980: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 11661 1726882372.01020: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0eada90> <<< 11661 
1726882372.01163: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' <<< 11661 1726882372.01186: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f32e0dd5670> <<< 11661 1726882372.01416: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip' <<< 11661 1726882372.01419: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.01496: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.01555: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/__init__.py <<< 11661 1726882372.01558: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.01563: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.01579: stdout chunk (state=3): >>>import ansible.module_utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available <<< 11661 1726882372.02794: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.03778: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e07ee7c0> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from 
'/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' <<< 11661 1726882372.03796: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 11661 1726882372.03821: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f32e07ee160> <<< 11661 1726882372.03866: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e07ee280> <<< 11661 1726882372.03889: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e07eef10> <<< 11661 1726882372.03914: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 11661 1726882372.03966: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e07ee4f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e07eed30> <<< 11661 1726882372.03972: stdout chunk (state=3): >>>import 'atexit' # <<< 11661 1726882372.04027: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f32e07eef70> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py 
<<< 11661 1726882372.04048: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 11661 1726882372.04086: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e07ee100> <<< 11661 1726882372.04102: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py <<< 11661 1726882372.04117: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 11661 1726882372.04138: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 11661 1726882372.04176: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 11661 1726882372.04180: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 11661 1726882372.04306: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e07adee0> <<< 11661 1726882372.04312: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f32e06c70d0> <<< 11661 1726882372.04337: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f32e06c72b0> <<< 11661 1726882372.04363: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 11661 1726882372.04397: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e06c7c40> <<< 11661 1726882372.04409: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e07d5dc0> <<< 11661 1726882372.04600: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e07d53a0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 11661 1726882372.04618: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e07d5f70> <<< 11661 1726882372.04645: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py <<< 11661 1726882372.04649: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 11661 1726882372.04688: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 11661 1726882372.04715: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 11661 1726882372.04747: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from 
'/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0d48c10> <<< 11661 1726882372.04831: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e07f6cd0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e07f63a0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e07a2b80> <<< 11661 1726882372.04861: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f32e07f64c0> <<< 11661 1726882372.04888: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e07f64f0> <<< 11661 1726882372.04916: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py <<< 11661 1726882372.04922: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' <<< 11661 1726882372.04945: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 11661 1726882372.04980: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 11661 1726882372.05070: stdout chunk (state=3): >>># extension module '_datetime' loaded from 
'/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f32e0725250> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0d5a1f0> <<< 11661 1726882372.05083: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py <<< 11661 1726882372.05091: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 11661 1726882372.05138: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f32e07338e0> <<< 11661 1726882372.05144: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0d5a370> <<< 11661 1726882372.05173: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 11661 1726882372.05198: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 11661 1726882372.05234: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' <<< 11661 1726882372.05241: stdout chunk (state=3): >>>import '_string' # <<< 11661 1726882372.05290: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0d5aca0> <<< 11661 1726882372.05428: stdout chunk (state=3): >>>import 'logging' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0733880> <<< 11661 1726882372.05518: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f32e07258b0> <<< 11661 1726882372.05547: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f32e07ce190> <<< 11661 1726882372.05601: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f32e0d5a670> <<< 11661 1726882372.05627: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0d538b0> <<< 11661 1726882372.05653: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py <<< 11661 1726882372.05656: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 11661 1726882372.05699: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f32e07269d0> <<< 11661 1726882372.05890: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f32e0744b80> <<< 11661 1726882372.05893: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0731640> <<< 11661 1726882372.05929: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' <<< 11661 1726882372.05958: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f32e0726f70> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0731a30> <<< 11661 1726882372.05962: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.05967: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.05969: stdout chunk (state=3): >>>import ansible.module_utils.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/__init__.py <<< 11661 1726882372.05971: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 11661 1726882372.06056: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.06123: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.06143: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/__init__.py <<< 11661 1726882372.06182: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11661 1726882372.06185: stdout chunk (state=3): >>>import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/__init__.py <<< 11661 1726882372.06188: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.06287: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.06379: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.06830: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.07300: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/six/__init__.py <<< 11661 1726882372.07304: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # <<< 11661 1726882372.07335: stdout chunk (state=3): >>>import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py <<< 11661 1726882372.07338: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 11661 1726882372.07392: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f32e076d7c0> <<< 11661 1726882372.07465: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' <<< 11661 1726882372.07469: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0772820> <<< 11661 1726882372.07479: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e02ba9a0> <<< 11661 1726882372.07515: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/selinux.py <<< 11661 1726882372.07536: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.07567: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.07570: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/_text.py <<< 11661 1726882372.07572: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.07690: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.07826: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from 
'/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 11661 1726882372.07854: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e07ac760> <<< 11661 1726882372.07857: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.08245: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.08610: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.08666: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.08729: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available <<< 11661 1726882372.08768: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.08797: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/warnings.py <<< 11661 1726882372.08812: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.08864: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.08959: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/errors.py <<< 11661 1726882372.08965: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.08968: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/__init__.py <<< 11661 1726882372.08990: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 
1726882372.09003: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.09053: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/convert_bool.py <<< 11661 1726882372.09056: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.09236: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.09428: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 11661 1726882372.09463: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' <<< 11661 1726882372.09469: stdout chunk (state=3): >>>import '_ast' # <<< 11661 1726882372.09541: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e07f03d0> # zipimport: zlib available <<< 11661 1726882372.09596: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.09680: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/validation.py <<< 11661 1726882372.09684: stdout chunk (state=3): >>>import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/parameters.py <<< 11661 1726882372.09686: stdout chunk (state=3): >>>import ansible.module_utils.common.arg_spec # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/arg_spec.py <<< 11661 1726882372.09696: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.09732: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.09769: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/locale.py <<< 11661 1726882372.09783: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.09807: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.09853: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.09941: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.10000: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 11661 1726882372.10021: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 11661 1726882372.10096: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f32e07649a0> <<< 11661 1726882372.10189: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e014e430> <<< 11661 1726882372.10233: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/file.py <<< 11661 1726882372.10236: stdout chunk (state=3): >>>import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/process.py <<< 11661 1726882372.10239: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.10283: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.10339: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.10361: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.10400: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py <<< 11661 1726882372.10411: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' <<< 11661 1726882372.10425: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 11661 1726882372.10468: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 11661 1726882372.10493: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py <<< 11661 1726882372.10523: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 11661 1726882372.10582: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0775670> <<< 11661 1726882372.10625: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e07c0d90> <<< 11661 1726882372.10689: stdout chunk (state=3): >>>import 'distro' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f32e07f0400> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available <<< 11661 1726882372.10717: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.10743: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/_utils.py <<< 11661 1726882372.10747: stdout chunk (state=3): >>>import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/sys_info.py <<< 11661 1726882372.10852: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/basic.py <<< 11661 1726882372.10855: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.10859: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.10862: stdout chunk (state=3): >>>import ansible.modules # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/modules/__init__.py <<< 11661 1726882372.10864: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.10914: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.10967: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.10986: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.10998: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.11037: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 11661 1726882372.11074: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.11105: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.11148: stdout chunk (state=3): >>>import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/namespace.py <<< 11661 1726882372.11154: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.11211: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.11279: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.11292: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.11330: stdout chunk (state=3): >>>import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available <<< 11661 1726882372.11480: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.11618: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.11656: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.11692: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' <<< 11661 1726882372.11727: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py <<< 11661 1726882372.11730: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' <<< 11661 1726882372.11755: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py <<< 11661 1726882372.11770: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' <<< 11661 1726882372.11793: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e02e7ac0> <<< 11661 1726882372.11810: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' <<< 11661 1726882372.11832: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py <<< 11661 1726882372.11867: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' <<< 11661 1726882372.11897: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py <<< 11661 1726882372.11914: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' <<< 11661 1726882372.11918: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0298a90> <<< 11661 1726882372.11946: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f32e0298a00> <<< 11661 1726882372.12009: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e02cf760> <<< 11661 
1726882372.12024: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e02e7190> <<< 11661 1726882372.12049: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e003af10> <<< 11661 1726882372.12072: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e003aaf0> <<< 11661 1726882372.12088: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py <<< 11661 1726882372.12101: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' <<< 11661 1726882372.12122: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' <<< 11661 1726882372.12157: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f32e07d1cd0> <<< 11661 1726882372.12172: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0287160> <<< 11661 1726882372.12196: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py <<< 11661 1726882372.12199: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' <<< 11661 1726882372.12240: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f32e07d12e0> <<< 11661 1726882372.12243: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py <<< 11661 1726882372.12276: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' <<< 11661 1726882372.12301: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' <<< 11661 1726882372.12308: stdout chunk (state=3): >>># extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f32e00a2fa0> <<< 11661 1726882372.12319: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e02cbdc0> <<< 11661 1726882372.12344: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e003adc0> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/timeout.py <<< 11661 1726882372.12359: stdout chunk (state=3): >>>import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/collector.py <<< 11661 1726882372.12377: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.12390: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/__init__.py <<< 11661 1726882372.12402: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.12454: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.12506: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/facter.py <<< 11661 1726882372.12516: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.12559: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.12594: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/ohai.py <<< 11661 1726882372.12612: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.12630: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/__init__.py <<< 11661 1726882372.12645: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.12677: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.12691: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/apparmor.py <<< 11661 1726882372.12704: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.12744: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.12791: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.caps # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available <<< 11661 1726882372.12830: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.12872: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/chroot.py <<< 11661 1726882372.12880: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.12928: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.12986: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.13027: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.13089: stdout chunk (state=3): >>>import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/cmdline.py <<< 11661 1726882372.13110: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.13483: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.13869: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available <<< 11661 1726882372.14001: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.14004: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.14009: stdout chunk (state=3): >>># zipimport: zlib available <<< 
11661 1726882372.14011: stdout chunk (state=3): >>>import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/date_time.py <<< 11661 1726882372.14018: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.14042: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.14076: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/env.py <<< 11661 1726882372.14082: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.14129: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.14185: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/dns.py <<< 11661 1726882372.14194: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.14218: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.14244: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/fips.py <<< 11661 1726882372.14247: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.14278: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.14297: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.loadavg # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/loadavg.py <<< 11661 1726882372.14308: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.14370: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.14446: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' <<< 11661 1726882372.14476: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e02bf670> <<< 11661 1726882372.14488: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py <<< 11661 1726882372.14515: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' <<< 11661 1726882372.14681: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32dffbcf10> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/local.py <<< 11661 1726882372.14684: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.14742: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.14802: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/lsb.py <<< 11661 1726882372.14805: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.14887: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.14958: stdout chunk (state=3): >>>import 
ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available <<< 11661 1726882372.15017: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.15096: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/platform.py <<< 11661 1726882372.15100: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.15122: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.15171: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py <<< 11661 1726882372.15186: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' <<< 11661 1726882372.15340: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f32dffacc10> <<< 11661 1726882372.15583: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32dfff9b20> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/python.py <<< 11661 1726882372.15587: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.15634: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.15687: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.selinux # loaded from 
Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/selinux.py <<< 11661 1726882372.15690: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.15770: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.15831: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.15923: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.16061: stdout chunk (state=3): >>>import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py <<< 11661 1726882372.16068: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.16104: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.16143: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py <<< 11661 1726882372.16146: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.16180: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.16227: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' <<< 11661 1726882372.16271: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from 
'/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f32dff354f0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32dff35a30> <<< 11661 1726882372.16298: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available <<< 11661 1726882372.16321: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py <<< 11661 1726882372.16325: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.16366: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.16404: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/base.py <<< 11661 1726882372.16415: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.16539: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.16669: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available <<< 11661 1726882372.16752: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.16832: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.16869: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.16904: stdout chunk (state=3): 
>>>import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py <<< 11661 1726882372.16916: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.17001: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.17012: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.17128: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.17245: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py <<< 11661 1726882372.17259: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available <<< 11661 1726882372.17368: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.17466: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py <<< 11661 1726882372.17483: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.17498: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.17537: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.17962: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.18369: stdout chunk 
(state=3): >>>import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/linux.py <<< 11661 1726882372.18382: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available <<< 11661 1726882372.18471: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.18558: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py <<< 11661 1726882372.18572: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.18643: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.18729: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available <<< 11661 1726882372.18859: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.18986: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available <<< 11661 1726882372.19003: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/__init__.py <<< 11661 1726882372.19018: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 11661 1726882372.19072: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.19096: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/base.py <<< 11661 1726882372.19099: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.19186: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.19265: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.19435: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.19603: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py <<< 11661 1726882372.19606: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/aix.py # zipimport: zlib available <<< 11661 1726882372.19639: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.19680: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available <<< 11661 1726882372.19693: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.19733: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py <<< 11661 
1726882372.19736: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.19787: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.20187: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py <<< 11661 1726882372.20190: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.20259: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available <<< 11661 1726882372.20342: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.20556: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/linux.py <<< 11661 1726882372.20559: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.20598: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.20660: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.iscsi # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/iscsi.py <<< 11661 1726882372.20669: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.20686: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.20718: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/nvme.py <<< 11661 1726882372.20740: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.20756: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.20791: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/netbsd.py <<< 11661 1726882372.20795: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.20824: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.20861: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/openbsd.py <<< 11661 1726882372.20869: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.20924: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.21002: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available <<< 11661 1726882372.21017: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from 
Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py <<< 11661 1726882372.21032: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.21070: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.21125: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available <<< 11661 1726882372.21139: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.21151: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.21192: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.21232: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.21290: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.21359: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py <<< 11661 1726882372.21363: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py <<< 11661 1726882372.21385: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.21407: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.21466: stdout chunk (state=3): >>>import 
ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py <<< 11661 1726882372.21470: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.21639: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.21791: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/linux.py <<< 11661 1726882372.21794: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.21840: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.21885: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available <<< 11661 1726882372.21920: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.21967: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available <<< 11661 1726882372.22025: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.22109: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/default_collectors.py <<< 
11661 1726882372.22112: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.22181: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.22261: stdout chunk (state=3): >>>import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/compat.py <<< 11661 1726882372.22272: stdout chunk (state=3): >>>import ansible.module_utils.facts # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/__init__.py <<< 11661 1726882372.22352: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882372.22523: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py <<< 11661 1726882372.22526: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' <<< 11661 1726882372.22571: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f32dfd7a0d0> <<< 11661 1726882372.22580: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32dfd7a910> <<< 11661 
1726882372.22624: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32dfd7af40> <<< 11661 1726882372.24044: stdout chunk (state=3): >>>import 'gc' # <<< 11661 1726882372.26504: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/queues.py <<< 11661 1726882372.26507: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc' <<< 11661 1726882372.26510: stdout chunk (state=3): >>>import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32dfd7a0a0> <<< 11661 1726882372.26537: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/synchronize.py <<< 11661 1726882372.26567: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc' <<< 11661 1726882372.26584: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32dff80f10> <<< 11661 1726882372.26637: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/__init__.py <<< 11661 1726882372.26640: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc' <<< 11661 1726882372.26676: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/connection.py <<< 11661 1726882372.26696: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc' import 'multiprocessing.dummy.connection' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f32dff285b0> <<< 11661 1726882372.26699: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32dff286a0> <<< 11661 1726882372.26922: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame <<< 11661 1726882372.26926: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 11661 1726882372.51011: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-158", "ansible_nodename": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "21e18164a0c64d0daed004bd8a1b67b7", "ansible_local": {}, "ansible_iscsi_iqn": "", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", 
"ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_lsb": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "32", "second": "52", "epoch": "1726882372", "epoch_int": "1726882372", "date": "2024-09-20", "time": "21:32:52", "iso8601_micro": "2024-09-21T01:32:52.248352Z", "iso8601": "2024-09-21T01:32:52Z", "iso8601_basic": "20240920T213252248352", "iso8601_basic_short": "20240920T213252", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2801, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 731, "free": 2801}, "nocache": {"free": 3257, "used": 275}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2e6858-9a88-b36a-7765-70992ab591a7", "ansible_product_uuid": "ec2e6858-9a88-b36a-7765-70992ab591a7", 
"ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 311, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264240525312, "block_size": 4096, "block_total": 65519355, "block_available": 64511847, "block_used": 1007508, "inode_total": 131071472, "inode_available": 130998721, "inode_used": 72751, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALEARW5ZJ51XTLSDuUsPojumVU0f1DmiQsXjMOap4QLlljOiysapjSUe6pZOyAdiI/KfARhDoOFvlC07kCLCcs7DDk8JxBZpsM0D55SdDlfwsB3FVgWNP+9by8G6kzbePHWdZyyWlAuavj4OAEwAjpWpP8/daus0ha4xywlVVoKjAAAAFQCbiW4bR+tgMvjrxC198dqI1mTbjQAAAIBzCzkJTtnGDKfOHq2dFI5cUEuaj1PgRot3wyaXENzUjZVnIFgXUmgKDCxO+EAtU6uAkBPQF4XNgiuaw5bavYpZxcJ4WIpM4ZDRoSkc7BBbJPRLZ45GfrHJwgqAmAZ3RSvVqeXE4WKQHLm43/eDHewgPqqqWe6QVuQH5SEe79yk3wAAAIEArG+AuupiAeoVJ9Lh36QMj4kRo5pTASh2eD5MqSOdy39UhsXbWBcj3JCIvNk/nwep/9neGyRZ5t5wT05dRX80vlgZJX65hrbepO+lqC3wlng+6GQ34D7TJKYnvEkR3neE0+06kx5R6IRWZf1YQV6fMQhx8AJ2JmvnLFicmYlkhQQ=", 
"ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDND+RJCrYgIUzolo5fZ64Ey6cksefKDUWmGDjsqVTmuT3HrlDyUZOro4JAnUQBmiamXsJUFbrFdJAVpukD4yyowqCQLr0ZFuKNEzrt5CObrtWflOskKynO3kaoU0WhDkqIbwS2j/+NxBCxgDGqd/5Os3cOMv3eyjUElz6xoI4zsmGMfxVYmT+/SHBfoyxyqY8Hw2Ooq+H5L9OlYgV4hqu7kKPpM1THUJTjy47m6qvws5gztclLjPA1KIW2Dz6kKzUYspNJcoS2sK1xFvL7mBjpGAP7WhXVH2n5ySenQ24Z6mEj+tG2f11rjPpjCUjDzzciGCWiRDZWBLm/GGmQXJJ8zAYnw82yIUKqufLrr1wmcXICPMVj9pFjXSoBWe/yhX9E87w7YD5HWsUrgrLdSctdV4QYy+R5g9ERi7FjwbRsuZ04BihZs70+f/29hUzuc6MA87KVovGT0Uc7GVC7bx8NLt0bTBsbydlONVHVQuol/YEpQrQophDvmBfh+PgMDH8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOEITn1vyppR+Moe1UdR0WGPhUnQ/dwHNcNi0OYy21LkBQ5jsxOPLvZ+C2MbRYlz2a<<< 11661 1726882372.51024: stdout chunk (state=3): >>>fs4nYYIV8E0AuK6aRks3w=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKEdFOHVk9tX1R+zEyLVdxS/U5QeeeFYWSnUmjpXlpt7", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fips": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fibre_channel_wwn": [], "ansible_loadavg": {"1m": 0.37, "5m": 0.32, "15m": 0.15}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, 
"ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_is_chroot": false, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", 
"rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:4f:68:7a:de:b1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.158", "broadcast": "10.31.11.255", "n<<< 11661 1726882372.51040: stdout chunk (state=3): >>>etmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::104f:68ff:fe7a:deb1", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off 
[fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.158", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:4f:68:7a:de:b1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.158"], "ansible_all_ipv6_addresses": ["fe80::104f:68ff:fe7a:deb1"], "ansible_locally_reachable_ips": {"ipv4": 
["10.31.11.158", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::104f:68ff:fe7a:deb1"]}, "ansible_service_mgr": "systemd", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_pkg_mgr": "dnf", "ansible_apparmor": {"status": "disabled"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 33528 10.31.11.158 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 33528 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 11661 1726882372.51562: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # 
cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc <<< 11661 1726882372.51683: stdout chunk (state=3): >>># cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing 
importlib._bootstrap_external <<< 11661 1726882372.51695: stdout chunk (state=3): >>># cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref <<< 11661 1726882372.51702: stdout chunk (state=3): >>># cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma <<< 11661 1726882372.51708: stdout chunk (state=3): >>># cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437<<< 11661 1726882372.51856: stdout chunk (state=3): >>> # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] 
removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal <<< 11661 1726882372.51861: stdout chunk (state=3): >>># cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text <<< 11661 1726882372.51869: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes <<< 11661 1726882372.51875: stdout chunk (state=3): >>># cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy 
ansible.module_utils.common.collections <<< 11661 1726882372.51881: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool<<< 11661 1726882372.51948: stdout chunk (state=3): >>> # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing 
ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue <<< 11661 1726882372.51958: stdout chunk (state=3): >>># cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor <<< 11661 1726882372.51966: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime <<< 11661 1726882372.51972: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing 
ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr<<< 11661 1726882372.52060: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing 
ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd<<< 11661 1726882372.52067: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system 
# destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time <<< 11661 1726882372.52073: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware<<< 11661 1726882372.52104: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy 
ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata <<< 11661 1726882372.52119: stdout chunk (state=3): >>># cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 11661 1726882372.52403: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 11661 1726882372.52425: stdout chunk (state=3): >>># destroy importlib.util # destroy importlib.abc # destroy importlib.machinery <<< 11661 1726882372.52456: stdout chunk (state=3): >>># destroy zipimport # destroy _compression <<< 11661 1726882372.52459: stdout chunk (state=3): >>># destroy binascii # destroy importlib # 
destroy bz2 # destroy lzma <<< 11661 1726882372.52492: stdout chunk (state=3): >>># destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib <<< 11661 1726882372.52510: stdout chunk (state=3): >>># destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings <<< 11661 1726882372.52527: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 11661 1726882372.52573: stdout chunk (state=3): >>># destroy selinux # destroy distro # destroy logging # destroy argparse <<< 11661 1726882372.52619: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy <<< 11661 1726882372.52638: stdout chunk (state=3): >>># destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle <<< 11661 1726882372.52665: stdout chunk (state=3): >>># destroy queue # destroy multiprocessing.reduction <<< 11661 1726882372.52690: stdout chunk (state=3): >>># destroy shlex <<< 11661 1726882372.52710: stdout chunk (state=3): >>># destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux <<< 11661 1726882372.52730: stdout chunk (state=3): >>># destroy getpass # destroy json <<< 11661 1726882372.52762: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob <<< 11661 1726882372.52782: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection <<< 11661 1726882372.52814: stdout chunk (state=3): >>># cleanup[3] wiping gc # cleanup[3] wiping 
encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios <<< 11661 1726882372.52849: stdout chunk (state=3): >>># cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes <<< 11661 1726882372.52888: stdout chunk (state=3): >>># cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess <<< 11661 1726882372.52913: stdout chunk (state=3): >>># cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading <<< 11661 1726882372.52932: stdout chunk (state=3): >>># cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external <<< 11661 1726882372.52961: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # 
destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools <<< 11661 1726882372.53009: stdout chunk (state=3): >>># destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale <<< 11661 1726882372.53035: stdout chunk (state=3): >>># destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc <<< 11661 1726882372.53062: stdout chunk (state=3): >>># cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 11661 1726882372.53110: stdout chunk (state=3): >>># destroy gc <<< 11661 1726882372.53122: stdout chunk (state=3): >>># destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 11661 1726882372.53274: stdout chunk (state=3): >>># destroy platform # destroy _uuid # 
destroy _sre # destroy sre_parse <<< 11661 1726882372.53292: stdout chunk (state=3): >>># destroy tokenize # destroy _heapq <<< 11661 1726882372.53315: stdout chunk (state=3): >>># destroy posixpath # destroy stat <<< 11661 1726882372.53341: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select <<< 11661 1726882372.53355: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator <<< 11661 1726882372.53377: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal <<< 11661 1726882372.53439: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 11661 1726882372.53818: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 11661 1726882372.53824: stdout chunk (state=3): >>><<< 11661 1726882372.53830: stderr chunk (state=3): >>><<< 11661 1726882372.54016: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e13f3dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e13983a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e13f3b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e13f3ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e1398490> # 
/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e1398940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e1398670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e134f190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e134f220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e1372850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e134f940> import 'os' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f32e13b0880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e1348d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e1372d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e1398970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e12eeeb0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e12f0f40> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # 
/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e12e7610> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e12ed640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e12ee370> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f32e0f91dc0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0f918b0> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0f91eb0> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches 
/usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0f91f70> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0f91e80> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e12c9d30> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e12c2610> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e12d6670> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e12f5e20> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f32e0fa3c70> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e12c9250> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f32e12d6280> import 
'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e12fb9d0> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0fa3fa0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0fa3d90> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0fa3d00> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0f76370> # 
/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0f76460> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0fabfa0> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0fa5a30> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0fa5490> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0ec41c0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0f61c70> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0fa5eb0> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e12fb040> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0ed6af0> import 'errno' # # extension module 'zlib' loaded from 
'/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f32e0ed6e20> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0ee8730> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0ee8c70> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f32e0e813a0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0ed6f10> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f32e0e91280> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0ee85b0> import 'pwd' # # extension module 'grp' 
loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f32e0e91340> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0fa39d0> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f32e0ead6a0> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f32e0ead970> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0ead760> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f32e0ead850> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc 
matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f32e0eadca0> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f32e0eb91f0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0ead8e0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0ea0a30> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0fa35b0> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0eada90> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f32e0dd5670> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e07ee7c0> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f32e07ee160> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e07ee280> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e07eef10> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e07ee4f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e07eed30> import 'atexit' # # extension module 'fcntl' loaded from 
'/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f32e07eef70> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e07ee100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e07adee0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f32e06c70d0> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f32e06c72b0> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from 
'/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e06c7c40> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e07d5dc0> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e07d53a0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e07d5f70> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0d48c10> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e07f6cd0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e07f63a0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e07a2b80> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f32e07f64c0> # 
/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e07f64f0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f32e0725250> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0d5a1f0> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f32e07338e0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0d5a370> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # 
/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0d5aca0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0733880> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f32e07258b0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f32e07ce190> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f32e0d5a670> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0d538b0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module 
'_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f32e07269d0> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f32e0744b80> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0731640> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f32e0726f70> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0731a30> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # 
zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f32e076d7c0> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0772820> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e02ba9a0> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e07ac760> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e07f03d0> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f32e07649a0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e014e430> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/file.py import 
ansible.module_utils.common.process # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0775670> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e07c0d90> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e07f0400> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # 
zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e02e7ac0> # /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from 
'/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py # code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e0298a90> # extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f32e0298a00> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e02cf760> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e02e7190> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e003af10> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e003aaf0> # /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' # extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f32e07d1cd0> import 'queue' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f32e0287160> # /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e07d12e0> # /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f32e00a2fa0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e02cbdc0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32e003adc0> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.facter # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' import 'glob' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f32e02bf670> # /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py # code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32dffbcf10> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py # code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f32dffacc10> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32dfff9b20> import ansible.module_utils.facts.system.python # 
loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f32dff354f0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32dff35a30> import ansible.module_utils.facts.system.user # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available # zipimport: zlib available 
import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.base # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/aix.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available # 
zipimport: zlib available import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available # zipimport: 
zlib available import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_eglx5iyo/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/__init__.py # zipimport: zlib available # /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 
'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f32dfd7a0d0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32dfd7a910> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32dfd7af40> import 'gc' # # /usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/queues.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32dfd7a0a0> # /usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32dff80f10> # /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32dff285b0> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f32dff286a0> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_system": "Linux", 
"ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-158", "ansible_nodename": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "21e18164a0c64d0daed004bd8a1b67b7", "ansible_local": {}, "ansible_iscsi_iqn": "", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_lsb": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "32", "second": "52", "epoch": "1726882372", "epoch_int": "1726882372", "date": "2024-09-20", "time": "21:32:52", "iso8601_micro": "2024-09-21T01:32:52.248352Z", "iso8601": "2024-09-21T01:32:52Z", "iso8601_basic": "20240920T213252248352", 
"iso8601_basic_short": "20240920T213252", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2801, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 731, "free": 2801}, "nocache": {"free": 3257, "used": 275}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2e6858-9a88-b36a-7765-70992ab591a7", "ansible_product_uuid": "ec2e6858-9a88-b36a-7765-70992ab591a7", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": 
[]}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 311, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264240525312, "block_size": 4096, "block_total": 65519355, "block_available": 64511847, "block_used": 1007508, "inode_total": 131071472, "inode_available": 130998721, "inode_used": 72751, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALEARW5ZJ51XTLSDuUsPojumVU0f1DmiQsXjMOap4QLlljOiysapjSUe6pZOyAdiI/KfARhDoOFvlC07kCLCcs7DDk8JxBZpsM0D55SdDlfwsB3FVgWNP+9by8G6kzbePHWdZyyWlAuavj4OAEwAjpWpP8/daus0ha4xywlVVoKjAAAAFQCbiW4bR+tgMvjrxC198dqI1mTbjQAAAIBzCzkJTtnGDKfOHq2dFI5cUEuaj1PgRot3wyaXENzUjZVnIFgXUmgKDCxO+EAtU6uAkBPQF4XNgiuaw5bavYpZxcJ4WIpM4ZDRoSkc7BBbJPRLZ45GfrHJwgqAmAZ3RSvVqeXE4WKQHLm43/eDHewgPqqqWe6QVuQH5SEe79yk3wAAAIEArG+AuupiAeoVJ9Lh36QMj4kRo5pTASh2eD5MqSOdy39UhsXbWBcj3JCIvNk/nwep/9neGyRZ5t5wT05dRX80vlgZJX65hrbepO+lqC3wlng+6GQ34D7TJKYnvEkR3neE0+06kx5R6IRWZf1YQV6fMQhx8AJ2JmvnLFicmYlkhQQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDND+RJCrYgIUzolo5fZ64Ey6cksefKDUWmGDjsqVTmuT3HrlDyUZOro4JAnUQBmiamXsJUFbrFdJAVpukD4yyowqCQLr0ZFuKNEzrt5CObrtWflOskKynO3kaoU0WhDkqIbwS2j/+NxBCxgDGqd/5Os3cOMv3eyjUElz6xoI4zsmGMfxVYmT+/SHBfoyxyqY8Hw2Ooq+H5L9OlYgV4hqu7kKPpM1THUJTjy47m6qvws5gztclLjPA1KIW2Dz6kKzUYspNJcoS2sK1xFvL7mBjpGAP7WhXVH2n5ySenQ24Z6mEj+tG2f11rjPpjCUjDzzciGCWiRDZWBLm/GGmQXJJ8zAYnw82yIUKqufLrr1wmcXICPMVj9pFjXSoBWe/yhX9E87w7YD5HWsUrgrLdSctdV4QYy+R5g9ERi7FjwbRsuZ04BihZs70+f/29hUzuc6MA87KVovGT0Uc7GVC7bx8NLt0bTBsbydlONVHVQuol/YEpQrQophDvmBfh+PgMDH8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOEITn1vyppR+Moe1UdR0WGPhUnQ/dwHNcNi0OYy21LkBQ5jsxOPLvZ+C2MbRYlz2afs4nYYIV8E0AuK6aRks3w=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKEdFOHVk9tX1R+zEyLVdxS/U5QeeeFYWSnUmjpXlpt7", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fips": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fibre_channel_wwn": [], "ansible_loadavg": {"1m": 0.37, "5m": 0.32, "15m": 0.15}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_is_chroot": false, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", 
"tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": 
{"device": "eth0", "macaddress": "12:4f:68:7a:de:b1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.158", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::104f:68ff:fe7a:deb1", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", 
"tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.158", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:4f:68:7a:de:b1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.158"], "ansible_all_ipv6_addresses": ["fe80::104f:68ff:fe7a:deb1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.158", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::104f:68ff:fe7a:deb1"]}, "ansible_service_mgr": "systemd", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_pkg_mgr": "dnf", "ansible_apparmor": {"status": "disabled"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 33528 10.31.11.158 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", 
"SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 33528 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing 
posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing 
bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing 
ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] 
removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing 
ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing 
ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing 
ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos 
# destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # 
destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle # destroy queue # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy 
linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # 
cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. [WARNING]: Module invocation had junk after the JSON data:
ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # 
cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing 
ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # 
destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy 
ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # 
destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle # destroy queue # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] 
wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # 
cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks [WARNING]: Platform linux on host managed_node2 is using the discovered Python interpreter at /usr/bin/python3.9, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible- core/2.17/reference_appendices/interpreter_discovery.html for more information. 
11661 1726882372.55362: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882371.3518252-11704-145882618393862/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11661 1726882372.55367: _low_level_execute_command(): starting 11661 1726882372.55370: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882371.3518252-11704-145882618393862/ > /dev/null 2>&1 && sleep 0' 11661 1726882372.56079: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882372.56094: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882372.56112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882372.56139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882372.56186: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882372.56199: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882372.56214: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882372.56242: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882372.56259: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 
10.31.11.158 is address <<< 11661 1726882372.56273: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882372.56286: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882372.56300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882372.56315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882372.56327: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882372.56345: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882372.56372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882372.56452: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882372.56477: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882372.56494: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882372.56628: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882372.58768: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882372.58772: stdout chunk (state=3): >>><<< 11661 1726882372.58774: stderr chunk (state=3): >>><<< 11661 1726882372.58777: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882372.58779: handler run complete 11661 1726882372.58781: variable 'ansible_facts' from source: unknown 11661 1726882372.58783: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882372.60378: variable 'ansible_facts' from source: unknown 11661 1726882372.60464: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882372.60596: attempt loop complete, returning result 11661 1726882372.60599: _execute() done 11661 1726882372.60602: dumping result to json 11661 1726882372.60630: done dumping result, returning 11661 1726882372.60643: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0e448fcc-3ce9-896b-2321-0000000000cc] 11661 1726882372.60650: sending task result for task 0e448fcc-3ce9-896b-2321-0000000000cc 11661 1726882372.60979: done sending task result for task 0e448fcc-3ce9-896b-2321-0000000000cc 11661 1726882372.60982: WORKER PROCESS EXITING ok: [managed_node2] 11661 1726882372.61262: no more pending results, returning what we have 11661 1726882372.61267: results queue empty 11661 1726882372.61268: checking for any_errors_fatal 11661 1726882372.61269: done checking for any_errors_fatal 11661 1726882372.61270: checking for max_fail_percentage 11661 1726882372.61271: done 
checking for max_fail_percentage 11661 1726882372.61272: checking to see if all hosts have failed and the running result is not ok 11661 1726882372.61272: done checking to see if all hosts have failed 11661 1726882372.61274: getting the remaining hosts for this loop 11661 1726882372.61275: done getting the remaining hosts for this loop 11661 1726882372.61278: getting the next task for host managed_node2 11661 1726882372.61285: done getting next task for host managed_node2 11661 1726882372.61287: ^ task is: TASK: meta (flush_handlers) 11661 1726882372.61288: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11661 1726882372.61292: getting variables 11661 1726882372.61294: in VariableManager get_vars() 11661 1726882372.61315: Calling all_inventory to load vars for managed_node2 11661 1726882372.61318: Calling groups_inventory to load vars for managed_node2 11661 1726882372.61321: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882372.61331: Calling all_plugins_play to load vars for managed_node2 11661 1726882372.61334: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882372.61337: Calling groups_plugins_play to load vars for managed_node2 11661 1726882372.61523: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882372.61736: done with get_vars() 11661 1726882372.61746: done getting variables 11661 1726882372.61828: in VariableManager get_vars() 11661 1726882372.61837: Calling all_inventory to load vars for managed_node2 11661 1726882372.61840: Calling groups_inventory to load vars for managed_node2 11661 1726882372.61842: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882372.61847: 
Calling all_plugins_play to load vars for managed_node2 11661 1726882372.61851: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882372.61860: Calling groups_plugins_play to load vars for managed_node2 11661 1726882372.62912: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882372.63124: done with get_vars() 11661 1726882372.63138: done queuing things up, now waiting for results queue to drain 11661 1726882372.63140: results queue empty 11661 1726882372.63141: checking for any_errors_fatal 11661 1726882372.63143: done checking for any_errors_fatal 11661 1726882372.63144: checking for max_fail_percentage 11661 1726882372.63145: done checking for max_fail_percentage 11661 1726882372.63146: checking to see if all hosts have failed and the running result is not ok 11661 1726882372.63146: done checking to see if all hosts have failed 11661 1726882372.63147: getting the remaining hosts for this loop 11661 1726882372.63151: done getting the remaining hosts for this loop 11661 1726882372.63153: getting the next task for host managed_node2 11661 1726882372.63159: done getting next task for host managed_node2 11661 1726882372.63161: ^ task is: TASK: Include the task 'el_repo_setup.yml' 11661 1726882372.63163: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11661 1726882372.63172: getting variables 11661 1726882372.63173: in VariableManager get_vars() 11661 1726882372.63188: Calling all_inventory to load vars for managed_node2 11661 1726882372.63190: Calling groups_inventory to load vars for managed_node2 11661 1726882372.63193: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882372.63197: Calling all_plugins_play to load vars for managed_node2 11661 1726882372.63199: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882372.63202: Calling groups_plugins_play to load vars for managed_node2 11661 1726882372.63352: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882372.63569: done with get_vars() 11661 1726882372.63577: done getting variables

TASK [Include the task 'el_repo_setup.yml'] ************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_nm.yml:11
Friday 20 September 2024  21:32:52 -0400 (0:00:01.338)       0:00:01.350 ******
11661 1726882372.63673: entering _queue_task() for managed_node2/include_tasks 11661 1726882372.63675: Creating lock for include_tasks 11661 1726882372.63998: worker is 1 (out of 1 available) 11661 1726882372.64011: exiting _queue_task() for managed_node2/include_tasks 11661 1726882372.64023: done queuing things up, now waiting for results queue to drain 11661 1726882372.64025: waiting for pending results... 
11661 1726882372.64293: running TaskExecutor() for managed_node2/TASK: Include the task 'el_repo_setup.yml' 11661 1726882372.64392: in run() - task 0e448fcc-3ce9-896b-2321-000000000006 11661 1726882372.64408: variable 'ansible_search_path' from source: unknown 11661 1726882372.64442: calling self._execute() 11661 1726882372.64524: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882372.64535: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882372.64547: variable 'omit' from source: magic vars 11661 1726882372.64661: _execute() done 11661 1726882372.64670: dumping result to json 11661 1726882372.64678: done dumping result, returning 11661 1726882372.64694: done running TaskExecutor() for managed_node2/TASK: Include the task 'el_repo_setup.yml' [0e448fcc-3ce9-896b-2321-000000000006] 11661 1726882372.64715: sending task result for task 0e448fcc-3ce9-896b-2321-000000000006 11661 1726882372.64863: no more pending results, returning what we have 11661 1726882372.64872: in VariableManager get_vars() 11661 1726882372.64904: Calling all_inventory to load vars for managed_node2 11661 1726882372.64907: Calling groups_inventory to load vars for managed_node2 11661 1726882372.64911: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882372.64924: Calling all_plugins_play to load vars for managed_node2 11661 1726882372.64928: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882372.64931: Calling groups_plugins_play to load vars for managed_node2 11661 1726882372.65156: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882372.65361: done with get_vars() 11661 1726882372.65370: variable 'ansible_search_path' from source: unknown 11661 1726882372.65392: we have included files to process 11661 1726882372.65393: generating all_blocks data 11661 1726882372.65395: done generating all_blocks data 11661 
1726882372.65396: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 11661 1726882372.65397: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 11661 1726882372.65400: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 11661 1726882372.65907: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000006 11661 1726882372.65910: WORKER PROCESS EXITING 11661 1726882372.66797: in VariableManager get_vars() 11661 1726882372.66925: done with get_vars() 11661 1726882372.66938: done processing included file 11661 1726882372.66940: iterating over new_blocks loaded from include file 11661 1726882372.66942: in VariableManager get_vars() 11661 1726882372.66953: done with get_vars() 11661 1726882372.66955: filtering new block on tags 11661 1726882372.67033: done filtering new block on tags 11661 1726882372.67036: in VariableManager get_vars() 11661 1726882372.67047: done with get_vars() 11661 1726882372.67052: filtering new block on tags 11661 1726882372.67072: done filtering new block on tags 11661 1726882372.67075: in VariableManager get_vars() 11661 1726882372.67085: done with get_vars() 11661 1726882372.67087: filtering new block on tags 11661 1726882372.67099: done filtering new block on tags 11661 1726882372.67101: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node2 11661 1726882372.67239: extending task lists for all hosts with included blocks 11661 1726882372.67297: done extending task lists 11661 1726882372.67299: done processing included files 11661 1726882372.67300: results queue empty 11661 1726882372.67300: checking for any_errors_fatal 11661 1726882372.67302: done checking for any_errors_fatal 
11661 1726882372.67302: checking for max_fail_percentage 11661 1726882372.67303: done checking for max_fail_percentage 11661 1726882372.67304: checking to see if all hosts have failed and the running result is not ok 11661 1726882372.67305: done checking to see if all hosts have failed 11661 1726882372.67306: getting the remaining hosts for this loop 11661 1726882372.67307: done getting the remaining hosts for this loop 11661 1726882372.67309: getting the next task for host managed_node2 11661 1726882372.67313: done getting next task for host managed_node2 11661 1726882372.67315: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 11661 1726882372.67318: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11661 1726882372.67320: getting variables 11661 1726882372.67321: in VariableManager get_vars() 11661 1726882372.67329: Calling all_inventory to load vars for managed_node2 11661 1726882372.67331: Calling groups_inventory to load vars for managed_node2 11661 1726882372.67333: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882372.67338: Calling all_plugins_play to load vars for managed_node2 11661 1726882372.67341: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882372.67429: Calling groups_plugins_play to load vars for managed_node2 11661 1726882372.67620: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882372.67814: done with get_vars() 11661 1726882372.67822: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Friday 20 September 2024 21:32:52 -0400 (0:00:00.045) 0:00:01.395 ****** 11661 1726882372.68215: entering _queue_task() for managed_node2/setup 11661 1726882372.68510: worker is 1 (out of 1 available) 11661 1726882372.68521: exiting _queue_task() for managed_node2/setup 11661 1726882372.68533: done queuing things up, now waiting for results queue to drain 11661 1726882372.68535: waiting for pending results... 
11661 1726882372.69534: running TaskExecutor() for managed_node2/TASK: Gather the minimum subset of ansible_facts required by the network role test 11661 1726882372.69666: in run() - task 0e448fcc-3ce9-896b-2321-0000000000dd 11661 1726882372.69786: variable 'ansible_search_path' from source: unknown 11661 1726882372.69793: variable 'ansible_search_path' from source: unknown 11661 1726882372.69855: calling self._execute() 11661 1726882372.69963: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882372.69978: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882372.69995: variable 'omit' from source: magic vars 11661 1726882372.70535: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11661 1726882372.73396: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11661 1726882372.73496: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11661 1726882372.73837: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11661 1726882372.73878: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11661 1726882372.73910: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11661 1726882372.74001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11661 1726882372.74037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11661 1726882372.74070: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882372.74114: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11661 1726882372.74131: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11661 1726882372.74417: variable 'ansible_facts' from source: unknown 11661 1726882372.74539: variable 'network_test_required_facts' from source: task vars 11661 1726882372.74729: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 11661 1726882372.74742: variable 'omit' from source: magic vars 11661 1726882372.74790: variable 'omit' from source: magic vars 11661 1726882372.74908: variable 'omit' from source: magic vars 11661 1726882372.74939: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11661 1726882372.75042: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11661 1726882372.75070: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11661 1726882372.75091: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882372.75134: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882372.75170: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11661 1726882372.75209: variable 'ansible_host' from source: host vars for 
'managed_node2' 11661 1726882372.75218: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882372.75321: Set connection var ansible_connection to ssh 11661 1726882372.75333: Set connection var ansible_pipelining to False 11661 1726882372.75348: Set connection var ansible_module_compression to ZIP_DEFLATED 11661 1726882372.75362: Set connection var ansible_timeout to 10 11661 1726882372.75372: Set connection var ansible_shell_type to sh 11661 1726882372.75383: Set connection var ansible_shell_executable to /bin/sh 11661 1726882372.75410: variable 'ansible_shell_executable' from source: unknown 11661 1726882372.75417: variable 'ansible_connection' from source: unknown 11661 1726882372.75424: variable 'ansible_module_compression' from source: unknown 11661 1726882372.75431: variable 'ansible_shell_type' from source: unknown 11661 1726882372.75437: variable 'ansible_shell_executable' from source: unknown 11661 1726882372.75444: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882372.75456: variable 'ansible_pipelining' from source: unknown 11661 1726882372.75466: variable 'ansible_timeout' from source: unknown 11661 1726882372.75475: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882372.75620: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11661 1726882372.75633: variable 'omit' from source: magic vars 11661 1726882372.75642: starting attempt loop 11661 1726882372.75648: running the handler 11661 1726882372.75675: _low_level_execute_command(): starting 11661 1726882372.75688: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11661 1726882372.77170: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 
1726882372.77186: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882372.77201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882372.77219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882372.77261: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882372.77277: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882372.77291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882372.77308: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882372.77320: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882372.77330: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882372.77341: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882372.77354: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882372.77374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882372.77386: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882372.77399: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882372.77406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882372.77485: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882372.77502: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882372.77526: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 
1726882372.77795: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882372.79863: stdout chunk (state=3): >>>/root <<< 11661 1726882372.79900: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882372.79904: stdout chunk (state=3): >>><<< 11661 1726882372.79907: stderr chunk (state=3): >>><<< 11661 1726882372.80007: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882372.80010: _low_level_execute_command(): starting 11661 1726882372.80015: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882372.7992702-11772-229695261061640 `" && echo ansible-tmp-1726882372.7992702-11772-229695261061640="` echo 
/root/.ansible/tmp/ansible-tmp-1726882372.7992702-11772-229695261061640 `" ) && sleep 0' 11661 1726882372.81290: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882372.81307: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882372.81330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882372.81356: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882372.81403: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882372.81452: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882372.81474: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882372.81491: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882372.81503: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882372.81512: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882372.81524: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882372.81538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882372.81572: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882372.81590: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882372.81601: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882372.81658: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882372.81825: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 
1726882372.81842: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882372.81859: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882372.81996: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882372.83889: stdout chunk (state=3): >>>ansible-tmp-1726882372.7992702-11772-229695261061640=/root/.ansible/tmp/ansible-tmp-1726882372.7992702-11772-229695261061640 <<< 11661 1726882372.84094: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882372.84097: stdout chunk (state=3): >>><<< 11661 1726882372.84100: stderr chunk (state=3): >>><<< 11661 1726882372.84379: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882372.7992702-11772-229695261061640=/root/.ansible/tmp/ansible-tmp-1726882372.7992702-11772-229695261061640 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 11661 1726882372.84383: variable 'ansible_module_compression' from source: unknown 11661 1726882372.84385: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11661k_nrvl51/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 11661 1726882372.84388: variable 'ansible_facts' from source: unknown 11661 1726882372.84461: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882372.7992702-11772-229695261061640/AnsiballZ_setup.py 11661 1726882372.85048: Sending initial data 11661 1726882372.85054: Sent initial data (154 bytes) 11661 1726882372.87303: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882372.87355: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882372.87412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882372.87435: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882372.87486: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882372.87541: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882372.87569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882372.87590: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882372.87636: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882372.87658: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882372.87679: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882372.87698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882372.87715: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882372.87761: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882372.87776: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882372.87795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882372.87993: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882372.88015: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882372.88030: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882372.88193: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882372.89969: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11661 1726882372.90068: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 11661 1726882372.90174: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11661k_nrvl51/tmp4v4t1r6h /root/.ansible/tmp/ansible-tmp-1726882372.7992702-11772-229695261061640/AnsiballZ_setup.py <<< 11661 1726882372.90274: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11661 1726882372.93587: stderr chunk (state=3): >>>debug2: Received exit 
status from master 0 <<< 11661 1726882372.93721: stderr chunk (state=3): >>><<< 11661 1726882372.93725: stdout chunk (state=3): >>><<< 11661 1726882372.93727: done transferring module to remote 11661 1726882372.93729: _low_level_execute_command(): starting 11661 1726882372.93732: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882372.7992702-11772-229695261061640/ /root/.ansible/tmp/ansible-tmp-1726882372.7992702-11772-229695261061640/AnsiballZ_setup.py && sleep 0' 11661 1726882372.95136: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882372.95180: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882372.95196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882372.95215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882372.95376: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882372.95389: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882372.95403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882372.95421: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882372.95433: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882372.95444: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882372.95463: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882372.95485: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882372.95501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882372.95513: stderr 
chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882372.95524: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882372.95537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882372.95617: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882372.95695: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882372.95710: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882372.95909: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882372.97786: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882372.97790: stdout chunk (state=3): >>><<< 11661 1726882372.97793: stderr chunk (state=3): >>><<< 11661 1726882372.97897: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882372.97902: _low_level_execute_command(): starting 11661 1726882372.97905: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882372.7992702-11772-229695261061640/AnsiballZ_setup.py && sleep 0' 11661 1726882372.99409: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882372.99545: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882372.99562: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882372.99585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882372.99631: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882372.99666: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882372.99763: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882372.99783: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882372.99793: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882372.99801: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882372.99811: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882372.99821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882372.99834: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882372.99843: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 
<<< 11661 1726882372.99859: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882372.99876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882372.99954: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882373.00087: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882373.00101: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882373.00315: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882373.02304: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin <<< 11661 1726882373.02308: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 11661 1726882373.02371: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 11661 1726882373.02406: stdout chunk (state=3): >>>import 'posix' # <<< 11661 1726882373.02439: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 11661 1726882373.02442: stdout chunk (state=3): >>># installing zipimport hook <<< 11661 1726882373.02475: stdout chunk (state=3): >>>import 'time' # <<< 11661 1726882373.02496: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 11661 1726882373.02540: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py <<< 11661 1726882373.02544: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 11661 1726882373.02557: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py <<< 11661 1726882373.02582: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' <<< 11661 1726882373.02591: stdout chunk 
(state=3): >>>import '_codecs' # <<< 11661 1726882373.02612: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45be43dc0> <<< 11661 1726882373.02641: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<< 11661 1726882373.02670: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45bbd83a0> <<< 11661 1726882373.02676: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45be43b20> <<< 11661 1726882373.02692: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' <<< 11661 1726882373.02705: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45be43ac0> <<< 11661 1726882373.02730: stdout chunk (state=3): >>>import '_signal' # <<< 11661 1726882373.02754: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py <<< 11661 1726882373.02757: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' <<< 11661 1726882373.02775: stdout chunk (state=3): >>>import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45bbd8490> <<< 11661 1726882373.02789: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' <<< 11661 1726882373.02837: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc'<<< 11661 1726882373.02844: stdout chunk (state=3): >>> <<< 11661 1726882373.02847: stdout chunk (state=3): >>>import '_abc' # <<< 11661 1726882373.02864: stdout chunk (state=3): >>>import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45bbd8940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45bbd8670> <<< 11661 1726882373.02898: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py <<< 11661 1726882373.02901: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 11661 1726882373.02927: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<< 11661 1726882373.02950: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<< 11661 1726882373.02966: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<< 11661 1726882373.02982: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 11661 1726882373.03001: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45bb8f190> <<< 11661 1726882373.03023: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 11661 1726882373.03043: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 11661 1726882373.03131: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45bb8f220> <<< 11661 1726882373.03155: stdout 
chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py <<< 11661 1726882373.03164: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 11661 1726882373.03189: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45bbb2850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45bb8f940> <<< 11661 1726882373.03223: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45bbf0880> <<< 11661 1726882373.03248: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' <<< 11661 1726882373.03251: stdout chunk (state=3): >>>import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45bb88d90> <<< 11661 1726882373.03315: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # <<< 11661 1726882373.03319: stdout chunk (state=3): >>>import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45bbb2d90> <<< 11661 1726882373.03367: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45bbd8970> <<< 11661 1726882373.03403: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 11661 1726882373.03750: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py <<< 11661 1726882373.03754: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 11661 1726882373.03783: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py <<< 11661 1726882373.03786: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 11661 1726882373.03802: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 11661 1726882373.03816: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 11661 1726882373.03834: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py <<< 11661 1726882373.03860: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' <<< 11661 1726882373.03862: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45bb2eeb0> <<< 11661 1726882373.03907: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45bb31f40> <<< 11661 1726882373.03935: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py <<< 11661 1726882373.03939: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 11661 1726882373.03954: stdout chunk (state=3): >>>import '_sre' # <<< 11661 1726882373.03994: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py <<< 11661 1726882373.03998: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<< 11661 1726882373.04014: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 11661 1726882373.04035: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45bb27610> <<< 11661 1726882373.04060: stdout chunk (state=3): >>>import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45bb2d640> <<< 11661 1726882373.04067: stdout chunk (state=3): >>>import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45bb2e370> <<< 11661 1726882373.04078: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 11661 1726882373.04160: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 11661 1726882373.04174: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 11661 1726882373.04231: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 11661 1726882373.04234: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py <<< 11661 1726882373.04242: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 11661 1726882373.04268: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader 
object at 0x7fc45ba13e20> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45ba13910> <<< 11661 1726882373.04287: stdout chunk (state=3): >>>import 'itertools' # <<< 11661 1726882373.04309: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py <<< 11661 1726882373.04315: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45ba13f10> <<< 11661 1726882373.04331: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py <<< 11661 1726882373.04355: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 11661 1726882373.04374: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45ba13fd0> <<< 11661 1726882373.04395: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' <<< 11661 1726882373.04415: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45ba260d0> <<< 11661 1726882373.04418: stdout chunk (state=3): >>>import '_collections' # <<< 11661 1726882373.04475: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45bb09d90> <<< 11661 1726882373.04482: stdout chunk (state=3): >>>import '_functools' # <<< 11661 1726882373.04501: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45bb02670> <<< 11661 1726882373.04560: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py 
<<< 11661 1726882373.04592: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45bb156d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45bb35e20> <<< 11661 1726882373.04598: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 11661 1726882373.04620: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc45ba26cd0> <<< 11661 1726882373.04623: stdout chunk (state=3): >>>import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45bb092b0> <<< 11661 1726882373.04669: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' <<< 11661 1726882373.04694: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc45bb152e0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45bb3b9d0> <<< 11661 1726882373.04697: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' <<< 11661 1726882373.04725: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object 
from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 11661 1726882373.04756: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py <<< 11661 1726882373.04761: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' <<< 11661 1726882373.04787: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45ba26eb0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45ba26df0> <<< 11661 1726882373.04803: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45ba26d60> <<< 11661 1726882373.04824: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py <<< 11661 1726882373.04837: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' <<< 11661 1726882373.04853: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py <<< 11661 1726882373.04868: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<< 11661 1726882373.04884: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 11661 1726882373.04935: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 11661 1726882373.04957: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches 
/usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' <<< 11661 1726882373.04976: stdout chunk (state=3): >>>import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b9f93d0> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py <<< 11661 1726882373.04993: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 11661 1726882373.05024: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b9f94c0> <<< 11661 1726882373.05597: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45ba2df40> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45ba28a90> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45ba28490> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b922220> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b9e4520> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45ba28f10> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45bb3b040> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # 
code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b934b50> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc45b934e80> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b945790> <<< 11661 1726882373.05610: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 11661 1726882373.05706: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b945cd0> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc45b8d3400> <<< 11661 1726882373.05725: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b934f70> <<< 11661 1726882373.05739: stdout chunk (state=3): 
>>># /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py <<< 11661 1726882373.05751: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 11661 1726882373.05816: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc45b8e42e0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b945610> import 'pwd' # <<< 11661 1726882373.05834: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' <<< 11661 1726882373.05845: stdout chunk (state=3): >>># extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc45b8e43a0> <<< 11661 1726882373.05927: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45ba26a30> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' <<< 11661 1726882373.05940: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 11661 1726882373.06033: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader 
object at 0x7fc45b8ff700> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc45b8ff9d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b8ff7c0> <<< 11661 1726882373.06057: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc45b8ff8b0> <<< 11661 1726882373.06086: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 11661 1726882373.06284: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' <<< 11661 1726882373.06296: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc45b8ffd00> <<< 11661 1726882373.06317: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' <<< 11661 1726882373.06333: stdout chunk (state=3): >>># extension module '_blake2' executed from 
'/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc45b90a250> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b8ff940> <<< 11661 1726882373.06354: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b8f3a90> <<< 11661 1726882373.06371: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45ba26610> <<< 11661 1726882373.06385: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<< 11661 1726882373.06471: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 11661 1726882373.06486: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b8ffaf0> <<< 11661 1726882373.06631: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' <<< 11661 1726882373.06648: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fc45b82e6d0> <<< 11661 1726882373.06871: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip' <<< 11661 1726882373.06882: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.06972: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.06995: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/__init__.py <<< 11661 1726882373.07020: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11661 1726882373.07039: stdout chunk (state=3): >>>import ansible.module_utils # loaded from Zip 
/tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/__init__.py <<< 11661 1726882373.07054: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.08276: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.09214: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b76a820> <<< 11661 1726882373.09234: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 11661 1726882373.09250: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' <<< 11661 1726882373.09278: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 11661 1726882373.09306: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc45b76a160> <<< 11661 1726882373.09353: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b76a280> <<< 11661 1726882373.09386: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fc45b76af70> <<< 11661 1726882373.09401: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 11661 1726882373.09461: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b76a4f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b76ad90> <<< 11661 1726882373.09470: stdout chunk (state=3): >>>import 'atexit' # <<< 11661 1726882373.09493: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' <<< 11661 1726882373.09497: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc45b76afd0> <<< 11661 1726882373.09508: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 11661 1726882373.09544: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 11661 1726882373.09579: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b76a100> <<< 11661 1726882373.09596: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py <<< 11661 1726882373.09620: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 11661 1726882373.09637: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 11661 1726882373.09670: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 11661 1726882373.09682: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 11661 1726882373.09769: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b7410d0> <<< 11661 1726882373.09805: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc45b108310> <<< 11661 1726882373.09862: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc45b108160> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py <<< 11661 1726882373.09867: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 11661 1726882373.09897: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b108ca0> <<< 11661 1726882373.09986: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b751dc0> <<< 11661 1726882373.10279: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b7513a0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches 
/usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b751fd0> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b7a1d30> <<< 11661 1726882373.10388: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b74cd30> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b74c400> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b1e1b20> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc45b74c520> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b74c550> <<< 11661 
1726882373.10425: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py <<< 11661 1726882373.10438: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' <<< 11661 1726882373.10476: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 11661 1726882373.10488: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 11661 1726882373.10598: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc45b173fd0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b7b3250> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py <<< 11661 1726882373.10602: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 11661 1726882373.10648: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc45b170850> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b7b33d0> <<< 11661 1726882373.10690: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 11661 1726882373.10711: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 11661 1726882373.10751: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' <<< 11661 1726882373.10755: stdout chunk (state=3): >>>import '_string' # <<< 11661 1726882373.10802: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b7b3ca0> <<< 11661 1726882373.10942: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b1707f0> <<< 11661 1726882373.11038: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc45b74bc10> <<< 11661 1726882373.11068: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc45b7b3fa0> <<< 11661 1726882373.11164: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc45b7b3550> import 'systemd.journal' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fc45b7ab910> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py <<< 11661 1726882373.11179: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 11661 1726882373.11223: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc45b166940> <<< 11661 1726882373.11439: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc45b183d90> <<< 11661 1726882373.11458: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b16f580> <<< 11661 1726882373.11502: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc45b166ee0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b16f9a0> # zipimport: zlib available # zipimport: zlib 
available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/compat/__init__.py <<< 11661 1726882373.11506: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.11591: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.11680: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.11713: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/common/__init__.py <<< 11661 1726882373.11716: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available <<< 11661 1726882373.11822: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.11915: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.12386: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.12870: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/six/__init__.py <<< 11661 1726882373.12874: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/common/text/converters.py <<< 11661 1726882373.12895: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 
11661 1726882373.12956: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc45b1827f0> <<< 11661 1726882373.13035: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' <<< 11661 1726882373.13053: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b1bd8b0> <<< 11661 1726882373.13057: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45ad17970> <<< 11661 1726882373.13092: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/compat/selinux.py <<< 11661 1726882373.13129: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.13143: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.13147: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available <<< 11661 1726882373.13267: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.13403: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 11661 1726882373.13424: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b1e9730> # zipimport: zlib 
available <<< 11661 1726882373.13820: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.14187: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.14239: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.14307: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available <<< 11661 1726882373.14346: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.14380: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/common/warnings.py <<< 11661 1726882373.14395: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.14443: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.14526: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/errors.py <<< 11661 1726882373.14559: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/parsing/__init__.py <<< 11661 1726882373.14562: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.14587: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.14632: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/parsing/convert_bool.py <<< 11661 1726882373.14635: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.14820: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 11661 1726882373.15009: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 11661 1726882373.15044: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # <<< 11661 1726882373.15122: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b76d370> # zipimport: zlib available <<< 11661 1726882373.15184: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.15258: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/common/parameters.py <<< 11661 1726882373.15282: stdout chunk (state=3): >>>import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available <<< 11661 1726882373.15319: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.15365: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/common/locale.py <<< 11661 1726882373.15368: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.15397: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.15438: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.15539: stdout chunk (state=3): >>># zipimport: zlib available 
<<< 11661 1726882373.15589: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 11661 1726882373.15612: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 11661 1726882373.15688: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc45b1a0550> <<< 11661 1726882373.15789: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45aba8160> <<< 11661 1726882373.15837: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available <<< 11661 1726882373.15890: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.15949: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.15973: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.16010: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py <<< 11661 1726882373.16036: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' <<< 11661 1726882373.16041: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 11661 1726882373.16078: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 11661 1726882373.16090: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py <<< 11661 1726882373.16114: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 11661 1726882373.16193: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b1a3910> <<< 11661 1726882373.16238: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b1a4790> <<< 11661 1726882373.16298: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b1a0b50> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available <<< 11661 1726882373.16348: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.16361: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/common/sys_info.py <<< 11661 1726882373.16433: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/basic.py <<< 11661 1726882373.16470: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip 
/tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/modules/__init__.py <<< 11661 1726882373.16474: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.16523: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.16582: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.16613: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.16616: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.16649: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.16683: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.16714: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.16756: stdout chunk (state=3): >>>import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/namespace.py <<< 11661 1726882373.16759: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.16821: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.16892: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.16905: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.16941: stdout chunk (state=3): >>>import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available <<< 11661 1726882373.17095: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.17224: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.17263: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.17305: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object 
from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' <<< 11661 1726882373.17331: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py <<< 11661 1726882373.17358: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py <<< 11661 1726882373.17376: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' <<< 11661 1726882373.17405: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45acda370> <<< 11661 1726882373.17423: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' <<< 11661 1726882373.17448: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py <<< 11661 1726882373.17472: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' <<< 11661 1726882373.17509: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' <<< 11661 1726882373.17525: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45acf6580> <<< 11661 1726882373.17556: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # 
extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc45acf64f0> <<< 11661 1726882373.17638: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45accb280> <<< 11661 1726882373.17677: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45acda970> <<< 11661 1726882373.17699: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45aa927f0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45aa92b20> <<< 11661 1726882373.17715: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py <<< 11661 1726882373.17738: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' <<< 11661 1726882373.17781: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc45ad37f70> <<< 11661 1726882373.17793: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45ace20a0> <<< 11661 1726882373.17811: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from 
'/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' <<< 11661 1726882373.17846: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45ad37e80> <<< 11661 1726882373.17872: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py <<< 11661 1726882373.17898: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' <<< 11661 1726882373.17911: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc45aafbfd0> <<< 11661 1726882373.17951: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45ad26820> <<< 11661 1726882373.17985: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45aa92d60> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/collector.py <<< 11661 1726882373.18012: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/other/__init__.py <<< 11661 1726882373.18025: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 11661 1726882373.18076: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.18126: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available <<< 11661 1726882373.18177: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.18220: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/other/ohai.py <<< 11661 1726882373.18235: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11661 1726882373.18261: stdout chunk (state=3): >>>import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available <<< 11661 1726882373.18295: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.18332: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/system/apparmor.py <<< 11661 1726882373.18335: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.18374: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.18428: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/system/caps.py <<< 11661 1726882373.18431: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.18455: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.18506: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.chroot # loaded from 
Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/system/chroot.py <<< 11661 1726882373.18509: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.18559: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.18604: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.18659: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.18715: stdout chunk (state=3): >>>import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/system/cmdline.py <<< 11661 1726882373.18718: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.19124: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.19494: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available <<< 11661 1726882373.19539: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.19585: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.19615: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.19649: stdout chunk (state=3): >>>import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available <<< 11661 1726882373.19680: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 11661 1726882373.19715: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/system/env.py <<< 11661 1726882373.19720: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.19758: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.19811: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available <<< 11661 1726882373.19859: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.19875: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available <<< 11661 1726882373.19894: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.19940: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available <<< 11661 1726882373.19992: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.20088: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' <<< 11661 1726882373.20111: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45a9dfe80> <<< 11661 1726882373.20131: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py <<< 11661 1726882373.20144: stdout chunk (state=3): 
>>># code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' <<< 11661 1726882373.20315: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45a9df9d0> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/system/local.py <<< 11661 1726882373.20318: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.20357: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.20423: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available <<< 11661 1726882373.20499: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.20586: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py <<< 11661 1726882373.20589: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.20637: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.20702: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available <<< 11661 1726882373.20743: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.20780: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py <<< 11661 1726882373.20804: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' <<< 11661 1726882373.20951: stdout chunk (state=3): >>># extension module '_ssl' 
loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc45aa5c550> <<< 11661 1726882373.21196: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45a9e3850> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available <<< 11661 1726882373.21246: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.21299: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/system/selinux.py <<< 11661 1726882373.21302: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.21368: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.21436: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.21528: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.21684: stdout chunk (state=3): >>>import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py <<< 11661 1726882373.21688: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.21700: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.21741: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip 
/tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py <<< 11661 1726882373.21744: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.21780: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.21821: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' <<< 11661 1726882373.21899: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc45aa59670> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45aa59220> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/system/user.py <<< 11661 1726882373.21919: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11661 1726882373.21923: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available <<< 11661 1726882373.21946: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.21993: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/base.py <<< 11661 1726882373.21996: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.22126: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 
1726882373.22251: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available <<< 11661 1726882373.22334: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.22409: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.22448: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.22493: stdout chunk (state=3): >>>import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py <<< 11661 1726882373.22496: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.22590: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.22604: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.22719: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.22839: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available <<< 11661 1726882373.22948: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.23057: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py <<< 11661 1726882373.23070: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.23081: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.23108: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.23535: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.23956: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py <<< 11661 1726882373.23959: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.24038: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.24135: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available <<< 11661 1726882373.24215: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.24307: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py <<< 11661 1726882373.24310: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.24425: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.24576: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available <<< 11661 1726882373.24591: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.network # 
loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available <<< 11661 1726882373.24621: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.24667: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available <<< 11661 1726882373.24753: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.24832: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.25003: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.25179: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py <<< 11661 1726882373.25182: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/network/aix.py # zipimport: zlib available <<< 11661 1726882373.25207: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.25256: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/network/darwin.py <<< 11661 1726882373.25259: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.25296: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.25299: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available <<< 11661 1726882373.25372: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.25430: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py <<< 11661 1726882373.25434: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.25475: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.25478: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available <<< 11661 1726882373.25521: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.25578: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available <<< 11661 1726882373.25620: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.25676: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available <<< 11661 1726882373.25891: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.26111: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/network/linux.py <<< 11661 1726882373.26116: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.26153: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.26208: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.iscsi # loaded from Zip 
/tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available <<< 11661 1726882373.26247: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.26278: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available <<< 11661 1726882373.26302: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.26337: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available <<< 11661 1726882373.26393: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.26408: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available <<< 11661 1726882373.26471: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.26570: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available <<< 11661 1726882373.26591: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py <<< 11661 1726882373.26604: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.26617: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.26670: stdout chunk (state=3): >>>import 
ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available <<< 11661 1726882373.26700: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.26703: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.26737: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.26778: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.26841: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.26907: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py <<< 11661 1726882373.26927: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py <<< 11661 1726882373.26930: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.26954: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.27005: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available <<< 11661 1726882373.27187: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.27338: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/linux.py <<< 11661 
1726882373.27341: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.27372: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.27423: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py <<< 11661 1726882373.27427: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.27461: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.27512: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py <<< 11661 1726882373.27515: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.27575: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.27662: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/default_collectors.py <<< 11661 1726882373.27667: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.27725: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.27825: stdout chunk (state=3): >>>import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/ansible_collector.py <<< 11661 1726882373.27828: stdout chunk (state=3): >>>import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/compat.py import 
ansible.module_utils.facts # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/__init__.py <<< 11661 1726882373.27885: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.28045: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' <<< 11661 1726882373.28075: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py <<< 11661 1726882373.28078: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' <<< 11661 1726882373.28118: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc45a99d100> <<< 11661 1726882373.28121: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45a9f1df0> <<< 11661 1726882373.28171: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45a9f1b50> <<< 11661 1726882373.30083: stdout chunk (state=3): >>>import 'gc' # <<< 11661 1726882373.30557: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-158", "ansible_nodename": "ip-10-31-11-158.us-east-1.aws.redhat.com", 
"ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "21e18164a0c64d0daed004bd8a1b67b7", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALEARW5ZJ51XTLSDuUsPojumVU0f1DmiQsXjMOap4QLlljOiysapjSUe6pZOyAdiI/KfARhDoOFvlC07kCLCcs7DDk8JxBZpsM0D55SdDlfwsB3FVgWNP+9by8G6kzbePHWdZyyWlAuavj4OAEwAjpWpP8/daus0ha4xywlVVoKjAAAAFQCbiW4bR+tgMvjrxC198dqI1mTbjQAAAIBzCzkJTtnGDKfOHq2dFI5cUEuaj1PgRot3wyaXENzUjZVnIFgXUmgKDCxO+EAtU6uAkBPQF4XNgiuaw5bavYpZxcJ4WIpM4ZDRoSkc7BBbJPRLZ45GfrHJwgqAmAZ3RSvVqeXE4WKQHLm43/eDHewgPqqqWe6QVuQH5SEe79yk3wAAAIEArG+AuupiAeoVJ9Lh36QMj4kRo5pTASh2eD5MqSOdy39UhsXbWBcj3JCIvNk/nwep/9neGyRZ5t5wT05dRX80vlgZJX65hrbepO+lqC3wlng+6GQ34D7TJKYnvEkR3neE0+06kx5R6IRWZf1YQV6fMQhx8AJ2JmvnLFicmYlkhQQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDND+RJCrYgIUzolo5fZ64Ey6cksefKDUWmGDjsqVTmuT3HrlDyUZOro4JAnUQBmiamXsJUFbrFdJAVpukD4yyowqCQLr0ZFuKNEzrt5CObrtWflOskKynO3kaoU0WhDkqIbwS2j/+NxBCxgDGqd/5Os3cOMv3eyjUElz6xoI4zsmGMfxVYmT+/SHBfoyxyqY8Hw2Ooq+H5L9OlYgV4hqu7kKPpM1THUJTjy47m6qvws5gztclLjPA1KIW2Dz6kKzUYspNJcoS2sK1xFvL7mBjpGAP7WhXVH2n5ySenQ24Z6mEj+tG2f11rjPpjCUjDzzciGCWiRDZWBLm/GGmQXJJ8zAYnw82yIUKqufLrr1wmcXICPMVj9pFjXSoBWe/yhX9E87w7YD5HWsUrgrLdSctdV4QYy+R5g9ERi7FjwbRsuZ04BihZs70+f/29hUzuc6MA87KVovGT0Uc7GVC7bx8NLt0bTBsbydlONVHVQuol/YEpQrQophDvmBfh+PgMDH8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOEITn1vyppR+Moe1UdR0WGPhUnQ/dwHNcNi0OYy21LkBQ5jsxOPLvZ+C2MbRYlz2afs4nYYIV8E0AuK6aRks3w=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKEdFOHVk9tX1R+zEyLVdxS/U5QeeeFYWSnUmjpXlpt7", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_lsb": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "32", "second": "53", "epoch": "1726882373", "epoch_int": "1726882373", "date": "2024-09-20", "time": "21:32:53", "iso8601_micro": "2024-09-21T01:32:53.301510Z", "iso8601": "2024-09-21T01:32:53Z", "iso8601_basic": "20240920T213253301510", "iso8601_basic_short": "20240920T213253", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_local": {}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": 
"/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_pkg_mgr": "dnf", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_service_mgr": "systemd", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 33528 10.31.11.158 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 33528 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_apparmor": {"status": "disabled"}, "ansible_fips": false, "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 11661 1726882373.31139: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache <<< 11661 1726882373.31305: stdout chunk (state=3): >>># clear sys.meta_path # clear 
sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] 
removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] 
removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime <<< 11661 1726882373.31372: stdout chunk (state=3): >>># cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing 
ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool <<< 11661 1726882373.31488: stdout chunk (state=3): >>># destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] 
removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # 
cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user <<< 11661 1726882373.31580: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # 
cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy 
ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly <<< 11661 1726882373.31606: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # 
destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc <<< 11661 1726882373.31875: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 11661 1726882373.31895: stdout chunk (state=3): >>># destroy importlib.util # destroy importlib.abc # destroy importlib.machinery <<< 11661 1726882373.31963: stdout chunk (state=3): >>># destroy zipimport # destroy _compression <<< 11661 1726882373.31985: stdout chunk (state=3): >>># destroy binascii # destroy importlib # destroy bz2 # destroy lzma <<< 11661 1726882373.32002: stdout chunk (state=3): >>># destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings <<< 11661 1726882373.32005: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 11661 1726882373.32045: stdout chunk (state=3): >>># destroy selinux # destroy distro # destroy logging # destroy argparse <<< 11661 1726882373.32098: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle 
<<< 11661 1726882373.32136: stdout chunk (state=3): >>># destroy queue # destroy multiprocessing.process # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction <<< 11661 1726882373.32178: stdout chunk (state=3): >>># destroy shlex # destroy datetime <<< 11661 1726882373.32214: stdout chunk (state=3): >>># destroy base64 <<< 11661 1726882373.32228: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 11661 1726882373.32288: stdout chunk (state=3): >>># cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep <<< 11661 1726882373.32354: stdout chunk (state=3): >>># cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select <<< 11661 1726882373.32410: stdout chunk (state=3): >>># cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # 
cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools <<< 11661 1726882373.32498: stdout chunk (state=3): >>># cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings <<< 11661 1726882373.32517: stdout chunk (state=3): >>># cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping 
builtins <<< 11661 1726882373.32531: stdout chunk (state=3): >>># destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 11661 1726882373.32725: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize <<< 11661 1726882373.32751: stdout chunk (state=3): >>># destroy _heapq # destroy posixpath # destroy stat <<< 11661 1726882373.32783: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator <<< 11661 1726882373.32796: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal <<< 11661 1726882373.32823: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 11661 1726882373.33192: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 11661 1726882373.33195: stdout chunk (state=3): >>><<< 11661 1726882373.33198: stderr chunk (state=3): >>><<< 11661 1726882373.33381: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45be43dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45bbd83a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45be43b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45be43ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45bbd8490> # 
/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45bbd8940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45bbd8670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45bb8f190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45bb8f220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45bbb2850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45bb8f940> import 'os' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fc45bbf0880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45bb88d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45bbb2d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45bbd8970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45bb2eeb0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45bb31f40> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # 
/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45bb27610> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45bb2d640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45bb2e370> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc45ba13e20> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45ba13910> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45ba13f10> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches 
/usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45ba13fd0> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45ba260d0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45bb09d90> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45bb02670> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45bb156d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45bb35e20> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc45ba26cd0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45bb092b0> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc45bb152e0> import 
'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45bb3b9d0> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45ba26eb0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45ba26df0> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45ba26d60> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b9f93d0> # 
/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b9f94c0> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45ba2df40> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45ba28a90> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45ba28490> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b922220> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b9e4520> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45ba28f10> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45bb3b040> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b934b50> import 'errno' # # extension module 'zlib' loaded from 
'/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc45b934e80> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b945790> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b945cd0> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc45b8d3400> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b934f70> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc45b8e42e0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b945610> import 'pwd' # # extension module 'grp' 
loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc45b8e43a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45ba26a30> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc45b8ff700> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc45b8ff9d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b8ff7c0> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc45b8ff8b0> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc 
matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc45b8ffd00> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc45b90a250> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b8ff940> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b8f3a90> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45ba26610> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b8ffaf0> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fc45b82e6d0> # zipimport: found 103 names in '/tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b76a820> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc45b76a160> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b76a280> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b76af70> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b76a4f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b76ad90> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fc45b76afd0> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b76a100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b7410d0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc45b108310> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc45b108160> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b108ca0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fc45b751dc0> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b7513a0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b751fd0> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b7a1d30> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b74cd30> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b74c400> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b1e1b20> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc45b74c520> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' 
import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b74c550> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc45b173fd0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b7b3250> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc45b170850> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b7b33d0> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b7b3ca0> import 
'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b1707f0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc45b74bc10> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc45b7b3fa0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc45b7b3550> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b7ab910> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object 
at 0x7fc45b166940> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc45b183d90> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b16f580> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc45b166ee0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b16f9a0> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters 
# loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc45b1827f0> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b1bd8b0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45ad17970> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b1e9730> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: 
zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b76d370> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc45b1a0550> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45aba8160> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fc45b1a3910> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b1a4790> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45b1a0b50> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45acda370> # /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py # code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45acf6580> # extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc45acf64f0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45accb280> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45acda970> import 
'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45aa927f0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45aa92b20> # /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' # extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc45ad37f70> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45ace20a0> # /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45ad37e80> # /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc45aafbfd0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fc45ad26820> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45aa92d60> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.chroot # loaded from Zip 
/tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45a9dfe80> # /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py # code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45a9df9d0> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py # code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from 
'/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc45aa5c550> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45a9e3850> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc45aa59670> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45aa59220> import 
ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # 
zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip 
/tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/network/aix.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available # zipimport: zlib available 
import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip 
/tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_setup_payload_2shvglo6/ansible_setup_payload.zip/ansible/module_utils/facts/__init__.py # zipimport: 
zlib available # /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc45a99d100> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45a9f1df0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc45a9f1b50> import 'gc' # {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-158", "ansible_nodename": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "21e18164a0c64d0daed004bd8a1b67b7", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBALEARW5ZJ51XTLSDuUsPojumVU0f1DmiQsXjMOap4QLlljOiysapjSUe6pZOyAdiI/KfARhDoOFvlC07kCLCcs7DDk8JxBZpsM0D55SdDlfwsB3FVgWNP+9by8G6kzbePHWdZyyWlAuavj4OAEwAjpWpP8/daus0ha4xywlVVoKjAAAAFQCbiW4bR+tgMvjrxC198dqI1mTbjQAAAIBzCzkJTtnGDKfOHq2dFI5cUEuaj1PgRot3wyaXENzUjZVnIFgXUmgKDCxO+EAtU6uAkBPQF4XNgiuaw5bavYpZxcJ4WIpM4ZDRoSkc7BBbJPRLZ45GfrHJwgqAmAZ3RSvVqeXE4WKQHLm43/eDHewgPqqqWe6QVuQH5SEe79yk3wAAAIEArG+AuupiAeoVJ9Lh36QMj4kRo5pTASh2eD5MqSOdy39UhsXbWBcj3JCIvNk/nwep/9neGyRZ5t5wT05dRX80vlgZJX65hrbepO+lqC3wlng+6GQ34D7TJKYnvEkR3neE0+06kx5R6IRWZf1YQV6fMQhx8AJ2JmvnLFicmYlkhQQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDND+RJCrYgIUzolo5fZ64Ey6cksefKDUWmGDjsqVTmuT3HrlDyUZOro4JAnUQBmiamXsJUFbrFdJAVpukD4yyowqCQLr0ZFuKNEzrt5CObrtWflOskKynO3kaoU0WhDkqIbwS2j/+NxBCxgDGqd/5Os3cOMv3eyjUElz6xoI4zsmGMfxVYmT+/SHBfoyxyqY8Hw2Ooq+H5L9OlYgV4hqu7kKPpM1THUJTjy47m6qvws5gztclLjPA1KIW2Dz6kKzUYspNJcoS2sK1xFvL7mBjpGAP7WhXVH2n5ySenQ24Z6mEj+tG2f11rjPpjCUjDzzciGCWiRDZWBLm/GGmQXJJ8zAYnw82yIUKqufLrr1wmcXICPMVj9pFjXSoBWe/yhX9E87w7YD5HWsUrgrLdSctdV4QYy+R5g9ERi7FjwbRsuZ04BihZs70+f/29hUzuc6MA87KVovGT0Uc7GVC7bx8NLt0bTBsbydlONVHVQuol/YEpQrQophDvmBfh+PgMDH8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOEITn1vyppR+Moe1UdR0WGPhUnQ/dwHNcNi0OYy21LkBQ5jsxOPLvZ+C2MbRYlz2afs4nYYIV8E0AuK6aRks3w=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKEdFOHVk9tX1R+zEyLVdxS/U5QeeeFYWSnUmjpXlpt7", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": 
{"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_lsb": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "32", "second": "53", "epoch": "1726882373", "epoch_int": "1726882373", "date": "2024-09-20", "time": "21:32:53", "iso8601_micro": "2024-09-21T01:32:53.301510Z", "iso8601": "2024-09-21T01:32:53Z", "iso8601_basic": "20240920T213253301510", "iso8601_basic_short": "20240920T213253", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_local": {}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_pkg_mgr": "dnf", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_service_mgr": "systemd", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 33528 
10.31.11.158 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 33528 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_apparmor": {"status": "disabled"}, "ansible_fips": false, "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # 
cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] 
removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # 
cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy 
swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing 
ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing 
ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing 
ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy 
ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale 
# destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy queue # destroy multiprocessing.process # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select 
# cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] 
wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. [WARNING]: Module invocation had junk after the JSON data: # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site 
# destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing 
tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing 
ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # 
cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing 
ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing 
ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy 
ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy 
ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # 
destroy multiprocessing.pool # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy queue # destroy multiprocessing.process # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] 
wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # 
destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks 11661 1726882373.34331: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882372.7992702-11772-229695261061640/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11661 1726882373.34334: _low_level_execute_command(): starting 11661 1726882373.34336: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882372.7992702-11772-229695261061640/ > /dev/null 2>&1 && sleep 0' 11661 1726882373.34866: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882373.34888: stderr chunk (state=3): 
>>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882373.34893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882373.34896: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882373.34898: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882373.34901: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882373.34903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882373.34905: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882373.34906: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882373.34911: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882373.34913: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882373.34914: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882373.34916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882373.34918: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882373.34919: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882373.34921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882373.34923: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882373.34924: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882373.34926: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882373.35082: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 11661 1726882373.36895: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882373.36899: stdout chunk (state=3): >>><<< 11661 1726882373.36906: stderr chunk (state=3): >>><<< 11661 1726882373.36936: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882373.36939: handler run complete 11661 1726882373.37005: variable 'ansible_facts' from source: unknown 11661 1726882373.37060: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882373.37176: variable 'ansible_facts' from source: unknown 11661 1726882373.37215: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882373.37355: attempt loop complete, returning result 11661 
1726882373.37358: _execute() done 11661 1726882373.37361: dumping result to json 11661 1726882373.37374: done dumping result, returning 11661 1726882373.37383: done running TaskExecutor() for managed_node2/TASK: Gather the minimum subset of ansible_facts required by the network role test [0e448fcc-3ce9-896b-2321-0000000000dd] 11661 1726882373.37386: sending task result for task 0e448fcc-3ce9-896b-2321-0000000000dd 11661 1726882373.37553: done sending task result for task 0e448fcc-3ce9-896b-2321-0000000000dd 11661 1726882373.37556: WORKER PROCESS EXITING ok: [managed_node2] 11661 1726882373.37656: no more pending results, returning what we have 11661 1726882373.37659: results queue empty 11661 1726882373.37660: checking for any_errors_fatal 11661 1726882373.37661: done checking for any_errors_fatal 11661 1726882373.37662: checking for max_fail_percentage 11661 1726882373.37665: done checking for max_fail_percentage 11661 1726882373.37665: checking to see if all hosts have failed and the running result is not ok 11661 1726882373.37666: done checking to see if all hosts have failed 11661 1726882373.37667: getting the remaining hosts for this loop 11661 1726882373.37669: done getting the remaining hosts for this loop 11661 1726882373.37671: getting the next task for host managed_node2 11661 1726882373.37679: done getting next task for host managed_node2 11661 1726882373.37681: ^ task is: TASK: Check if system is ostree 11661 1726882373.37683: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 11661 1726882373.37686: getting variables 11661 1726882373.37688: in VariableManager get_vars() 11661 1726882373.37711: Calling all_inventory to load vars for managed_node2 11661 1726882373.37713: Calling groups_inventory to load vars for managed_node2 11661 1726882373.37716: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882373.37726: Calling all_plugins_play to load vars for managed_node2 11661 1726882373.37728: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882373.37731: Calling groups_plugins_play to load vars for managed_node2 11661 1726882373.37915: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882373.38122: done with get_vars() 11661 1726882373.38132: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Friday 20 September 2024 21:32:53 -0400 (0:00:00.700) 0:00:02.096 ****** 11661 1726882373.38316: entering _queue_task() for managed_node2/stat 11661 1726882373.39081: worker is 1 (out of 1 available) 11661 1726882373.39094: exiting _queue_task() for managed_node2/stat 11661 1726882373.39106: done queuing things up, now waiting for results queue to drain 11661 1726882373.39108: waiting for pending results... 
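
[editor's note] The task queued here (el_repo_setup.yml:17) is a `stat`-based probe for an OSTree-managed host; further down, the guard `not __network_is_ostree is defined` evaluates True before the module runs. A rough shell equivalent of such a probe, as a sketch only — the `/run/ostree-booted` marker path is the conventional OSTree indicator and is an assumption here, not read from the test file:

```shell
# Sketch of the "Check if system is ostree" probe in plain shell.
# /run/ostree-booted is the conventional marker file on OSTree-booted
# hosts (an assumption here; the actual task uses the stat module).
is_ostree() {
    [ -e "${1:-/run/ostree-booted}" ]
}

if is_ostree; then
    echo "ostree deployment"
else
    echo "classic rpm deployment"
fi
```

The Ansible `stat` module reports the same existence check through the registered result's `stat.exists` field rather than an exit code.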
11661 1726882373.40388: running TaskExecutor() for managed_node2/TASK: Check if system is ostree 11661 1726882373.40597: in run() - task 0e448fcc-3ce9-896b-2321-0000000000df 11661 1726882373.40627: variable 'ansible_search_path' from source: unknown 11661 1726882373.40639: variable 'ansible_search_path' from source: unknown 11661 1726882373.40681: calling self._execute() 11661 1726882373.40851: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882373.40861: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882373.40876: variable 'omit' from source: magic vars 11661 1726882373.41849: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11661 1726882373.42501: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11661 1726882373.43227: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11661 1726882373.43344: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11661 1726882373.43383: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11661 1726882373.43502: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11661 1726882373.43661: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11661 1726882373.43695: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882373.43773: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11661 1726882373.44096: Evaluated conditional (not __network_is_ostree is defined): True 11661 1726882373.44109: variable 'omit' from source: magic vars 11661 1726882373.44153: variable 'omit' from source: magic vars 11661 1726882373.44223: variable 'omit' from source: magic vars 11661 1726882373.44325: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11661 1726882373.44422: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11661 1726882373.44446: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11661 1726882373.44469: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882373.44484: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882373.44534: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11661 1726882373.44543: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882373.44550: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882373.44662: Set connection var ansible_connection to ssh 11661 1726882373.44679: Set connection var ansible_pipelining to False 11661 1726882373.44691: Set connection var ansible_module_compression to ZIP_DEFLATED 11661 1726882373.44703: Set connection var ansible_timeout to 10 11661 1726882373.44710: Set connection var ansible_shell_type to sh 11661 1726882373.44729: Set connection var ansible_shell_executable to /bin/sh 11661 1726882373.44756: variable 'ansible_shell_executable' from source: unknown 11661 1726882373.44771: variable 'ansible_connection' from 
source: unknown 11661 1726882373.44784: variable 'ansible_module_compression' from source: unknown 11661 1726882373.44790: variable 'ansible_shell_type' from source: unknown 11661 1726882373.44796: variable 'ansible_shell_executable' from source: unknown 11661 1726882373.44801: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882373.44808: variable 'ansible_pipelining' from source: unknown 11661 1726882373.44813: variable 'ansible_timeout' from source: unknown 11661 1726882373.44820: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882373.44983: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11661 1726882373.45002: variable 'omit' from source: magic vars 11661 1726882373.45011: starting attempt loop 11661 1726882373.45016: running the handler 11661 1726882373.45031: _low_level_execute_command(): starting 11661 1726882373.45044: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11661 1726882373.45849: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882373.45869: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882373.45883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882373.45898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882373.45945: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882373.45957: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882373.45972: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 
1726882373.45992: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882373.46002: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882373.46011: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882373.46024: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882373.46039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882373.46052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882373.46062: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882373.46073: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882373.46085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882373.46166: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882373.46187: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882373.46204: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882373.46337: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882373.48001: stdout chunk (state=3): >>>/root <<< 11661 1726882373.48171: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882373.48198: stdout chunk (state=3): >>><<< 11661 1726882373.48203: stderr chunk (state=3): >>><<< 11661 1726882373.48313: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882373.48324: _low_level_execute_command(): starting 11661 1726882373.48328: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882373.4822066-11815-212166598929980 `" && echo ansible-tmp-1726882373.4822066-11815-212166598929980="` echo /root/.ansible/tmp/ansible-tmp-1726882373.4822066-11815-212166598929980 `" ) && sleep 0' 11661 1726882373.50027: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882373.50031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882373.50077: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882373.50080: stderr chunk (state=3): >>>debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882373.50083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 11661 1726882373.50085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882373.50143: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882373.50146: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882373.50149: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882373.50270: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882373.52143: stdout chunk (state=3): >>>ansible-tmp-1726882373.4822066-11815-212166598929980=/root/.ansible/tmp/ansible-tmp-1726882373.4822066-11815-212166598929980 <<< 11661 1726882373.52254: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882373.52329: stderr chunk (state=3): >>><<< 11661 1726882373.52332: stdout chunk (state=3): >>><<< 11661 1726882373.52673: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882373.4822066-11815-212166598929980=/root/.ansible/tmp/ansible-tmp-1726882373.4822066-11815-212166598929980 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882373.52771: variable 'ansible_module_compression' from source: unknown 11661 1726882373.52973: ANSIBALLZ: Using lock for stat 11661 1726882373.52977: ANSIBALLZ: Acquiring lock 11661 1726882373.52979: ANSIBALLZ: Lock acquired: 139652576276752 11661 1726882373.52981: ANSIBALLZ: Creating module 11661 1726882373.68905: ANSIBALLZ: Writing module into payload 11661 1726882373.69029: ANSIBALLZ: Writing module 11661 1726882373.69066: ANSIBALLZ: Renaming module 11661 1726882373.69077: ANSIBALLZ: Done creating module 11661 1726882373.69096: variable 'ansible_facts' from source: unknown 11661 1726882373.69167: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882373.4822066-11815-212166598929980/AnsiballZ_stat.py 11661 1726882373.69315: Sending initial data 11661 1726882373.69318: Sent initial data (153 bytes) 11661 1726882373.70321: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882373.70343: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882373.70365: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882373.70394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
<<< 11661 1726882373.70448: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882373.70462: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882373.70489: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882373.70507: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882373.70526: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882373.70546: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882373.70569: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882373.70602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882373.70631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882373.70642: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882373.70652: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882373.70669: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882373.70755: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882373.70780: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882373.70800: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882373.70979: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882373.72806: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports 
extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11661 1726882373.72897: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 11661 1726882373.73004: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11661k_nrvl51/tmp80z4tigx /root/.ansible/tmp/ansible-tmp-1726882373.4822066-11815-212166598929980/AnsiballZ_stat.py <<< 11661 1726882373.73099: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11661 1726882373.74406: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882373.74662: stderr chunk (state=3): >>><<< 11661 1726882373.74668: stdout chunk (state=3): >>><<< 11661 1726882373.74670: done transferring module to remote 11661 1726882373.74672: _low_level_execute_command(): starting 11661 1726882373.74675: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882373.4822066-11815-212166598929980/ /root/.ansible/tmp/ansible-tmp-1726882373.4822066-11815-212166598929980/AnsiballZ_stat.py && sleep 0' 11661 1726882373.75246: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882373.75262: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882373.75281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882373.75302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882373.75345: stderr chunk (state=3): >>>debug2: checking 
match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882373.75358: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882373.75379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882373.75397: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882373.75408: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882373.75420: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882373.75433: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882373.75447: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882373.75467: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882373.75481: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882373.75492: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882373.75505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882373.75582: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882373.75598: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882373.75613: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882373.75775: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882373.77542: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882373.77616: stderr chunk (state=3): >>><<< 11661 1726882373.77620: stdout chunk (state=3): >>><<< 11661 1726882373.77714: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882373.77717: _low_level_execute_command(): starting 11661 1726882373.77720: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882373.4822066-11815-212166598929980/AnsiballZ_stat.py && sleep 0' 11661 1726882373.78296: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882373.78309: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882373.78324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882373.78342: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882373.78390: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882373.78403: stderr 
chunk (state=3): >>>debug2: match not found <<< 11661 1726882373.78417: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882373.78434: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882373.78446: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882373.78461: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882373.78477: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882373.78491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882373.78507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882373.78520: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882373.78531: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882373.78544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882373.78624: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882373.78644: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882373.78660: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882373.79026: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882373.80984: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin <<< 11661 1726882373.80988: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 11661 1726882373.81047: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 11661 1726882373.81081: stdout chunk (state=3): >>>import 'posix' # <<< 11661 1726882373.81111: 
stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 11661 1726882373.81162: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # <<< 11661 1726882373.81168: stdout chunk (state=3): >>># installed zipimport hook <<< 11661 1726882373.81206: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 11661 1726882373.81223: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py <<< 11661 1726882373.81252: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # <<< 11661 1726882373.81280: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36d373dc0> <<< 11661 1726882373.81309: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<< 11661 1726882373.81333: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36d3183a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36d373b20> <<< 11661 1726882373.81368: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py <<< 11661 1726882373.81396: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' <<< 11661 1726882373.81406: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36d373ac0> import '_signal' # <<< 11661 1726882373.81422: stdout 
chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36d318490> <<< 11661 1726882373.81450: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' <<< 11661 1726882373.81485: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' <<< 11661 1726882373.81496: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36d318940> <<< 11661 1726882373.81515: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36d318670> <<< 11661 1726882373.81547: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py <<< 11661 1726882373.81576: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 11661 1726882373.81589: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<< 11661 1726882373.81623: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<< 11661 1726882373.81626: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<< 11661 1726882373.81660: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 11661 1726882373.81684: stdout chunk (state=3): >>>import '_stat' # import 'stat' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fd36d2cf190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 11661 1726882373.81697: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 11661 1726882373.81780: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36d2cf220> <<< 11661 1726882373.81794: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 11661 1726882373.81833: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36d2f2850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36d2cf940> <<< 11661 1726882373.81876: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36d330880> <<< 11661 1726882373.81891: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36d2c8d90> <<< 11661 1726882373.81958: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36d2f2d90> <<< 11661 
1726882373.82002: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36d318970> <<< 11661 1726882373.82030: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 11661 1726882373.82230: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 11661 1726882373.82276: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc'<<< 11661 1726882373.82280: stdout chunk (state=3): >>> <<< 11661 1726882373.82282: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 11661 1726882373.82313: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 11661 1726882373.82327: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py <<< 11661 1726882373.82339: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' <<< 11661 1726882373.82344: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36d26eeb0> <<< 11661 1726882373.82401: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36d270f40> <<< 11661 1726882373.82406: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py <<< 11661 1726882373.82427: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 11661 
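
[editor's note] The wall of `import ...` stdout chunks in this region is ordinary CPython verbose-import output: the module was launched with `PYTHONVERBOSE=1 /usr/bin/python3.9 .../AnsiballZ_stat.py`, so the interpreter logs every module as it is resolved (frozen, builtin, or loaded from a cached `.pyc`). The same trace can be reproduced against any script:

```shell
# Reproduce the import trace seen in this log: PYTHONVERBOSE=1 (same as
# python -v) makes the interpreter report each import to stderr, which
# is the noise interleaved with AnsiballZ_stat.py's real output above.
PYTHONVERBOSE=1 python3 -c 'import stat' 2>&1 | grep '^import' | head -5
```

This is why the `stdout chunk` lines here are dominated by interpreter bookkeeping rather than the stat module's JSON result.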
1726882373.82431: stdout chunk (state=3): >>>import '_sre' # <<< 11661 1726882373.82464: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py <<< 11661 1726882373.82482: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<< 11661 1726882373.82492: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 11661 1726882373.82522: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36d267610> <<< 11661 1726882373.82535: stdout chunk (state=3): >>>import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36d26d640> <<< 11661 1726882373.82551: stdout chunk (state=3): >>>import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36d26e370> <<< 11661 1726882373.82554: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 11661 1726882373.82643: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 11661 1726882373.82652: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 11661 1726882373.82693: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 11661 1726882373.82697: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py <<< 11661 1726882373.82700: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 11661 1726882373.82770: stdout chunk (state=3): >>># 
extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' <<< 11661 1726882373.82775: stdout chunk (state=3): >>># extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd36cfd4e20> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cfd4910> <<< 11661 1726882373.82777: stdout chunk (state=3): >>>import 'itertools' # <<< 11661 1726882373.82793: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py <<< 11661 1726882373.82797: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cfd4f10> <<< 11661 1726882373.82800: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py <<< 11661 1726882373.82815: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 11661 1726882373.82839: stdout chunk (state=3): >>>import '_operator' # <<< 11661 1726882373.82873: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cfd4fd0> <<< 11661 1726882373.82877: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' <<< 11661 1726882373.82880: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cfe70d0> <<< 11661 1726882373.82882: stdout chunk (state=3): >>>import '_collections' # <<< 11661 1726882373.83195: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7fd36d249d90> <<< 11661 1726882373.83198: stdout chunk (state=3): >>>import '_functools' # <<< 11661 1726882373.83251: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36d242670> <<< 11661 1726882373.83254: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py <<< 11661 1726882373.83290: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36d2546d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36d275e20> <<< 11661 1726882373.83302: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 11661 1726882373.83305: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' <<< 11661 1726882373.83307: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd36cfe7cd0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36d2492b0> <<< 11661 1726882373.83309: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' <<< 11661 1726882373.83311: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd36d2542e0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36d27b9d0> <<< 11661 
1726882373.83313: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py <<< 11661 1726882373.83377: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' <<< 11661 1726882373.83381: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 11661 1726882373.83383: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py <<< 11661 1726882373.83384: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' <<< 11661 1726882373.83386: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cfe7eb0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cfe7df0> <<< 11661 1726882373.83395: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' <<< 11661 1726882373.83398: stdout chunk (state=3): >>>import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cfe7d60> <<< 11661 1726882373.83400: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py <<< 11661 1726882373.83402: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' <<< 11661 1726882373.83404: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py <<< 11661 1726882373.83405: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<< 11661 1726882373.83418: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 11661 1726882373.83447: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 11661 1726882373.83455: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' <<< 11661 1726882373.83474: stdout chunk (state=3): >>>import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cfba3d0> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py <<< 11661 1726882373.83477: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 11661 1726882373.83479: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cfba4c0> <<< 11661 1726882373.83600: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cfeef40> <<< 11661 1726882373.83644: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cfe9a90> <<< 11661 1726882373.83647: stdout chunk (state=3): >>>import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cfe9490> <<< 11661 1726882373.83677: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py <<< 11661 1726882373.83688: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 11661 1726882373.83715: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py <<< 11661 1726882373.83740: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cee3220> <<< 11661 1726882373.83778: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cfa5520> <<< 11661 1726882373.83838: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cfe9f10> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36d27b040> <<< 11661 1726882373.83850: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 11661 1726882373.83888: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' <<< 11661 1726882373.83914: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cef5b50> import 'errno' # <<< 11661 1726882373.83958: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd36cef5e80> <<< 11661 1726882373.83988: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches 
/usr/lib64/python3.9/bz2.py <<< 11661 1726882373.84002: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cf06790> <<< 11661 1726882373.84028: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 11661 1726882373.84053: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 11661 1726882373.84080: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cf06cd0> <<< 11661 1726882373.84120: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd36ce94400> <<< 11661 1726882373.84138: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cef5f70> <<< 11661 1726882373.84153: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 11661 1726882373.84207: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd36cea52e0> <<< 11661 
1726882373.84211: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cf06610> import 'pwd' # <<< 11661 1726882373.84237: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd36cea53a0> <<< 11661 1726882373.84279: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cfe7a30> <<< 11661 1726882373.84284: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py <<< 11661 1726882373.84313: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' <<< 11661 1726882373.84332: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py <<< 11661 1726882373.84338: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 11661 1726882373.84372: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' <<< 11661 1726882373.84386: stdout chunk (state=3): >>># extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd36cec0700> <<< 11661 1726882373.84389: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' <<< 11661 1726882373.84415: stdout chunk (state=3): >>># extension module '_bisect' loaded from 
'/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' <<< 11661 1726882373.84421: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd36cec09d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cec07c0> <<< 11661 1726882373.84443: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd36cec08b0> <<< 11661 1726882373.84475: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 11661 1726882373.84674: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd36cec0d00> <<< 11661 1726882373.84702: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd36cecb250> <<< 11661 1726882373.84713: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cec0940> <<< 11661 1726882373.84721: stdout chunk (state=3): 
>>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36ceb4a90> <<< 11661 1726882373.84743: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cfe7610> <<< 11661 1726882373.84764: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<< 11661 1726882373.84827: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 11661 1726882373.84857: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cec0af0> <<< 11661 1726882373.84958: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' <<< 11661 1726882373.84966: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fd36cde16d0> <<< 11661 1726882373.85106: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_pk4jlht_/ansible_stat_payload.zip' # zipimport: zlib available <<< 11661 1726882373.85198: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.85212: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_stat_payload_pk4jlht_/ansible_stat_payload.zip/ansible/__init__.py <<< 11661 1726882373.85222: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.85237: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.85251: stdout chunk (state=3): >>>import ansible.module_utils # loaded from Zip /tmp/ansible_stat_payload_pk4jlht_/ansible_stat_payload.zip/ansible/module_utils/__init__.py <<< 11661 1726882373.85254: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.86462: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.87407: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py <<< 11661 1726882373.87426: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36c7d6820> <<< 11661 1726882373.87429: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py <<< 11661 1726882373.87433: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 11661 1726882373.87475: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py <<< 11661 1726882373.87479: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' <<< 11661 1726882373.87482: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py <<< 11661 1726882373.87484: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 11661 1726882373.87506: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' <<< 11661 1726882373.87512: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd36c7d6160> <<< 11661 1726882373.87580: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36c7d6280> <<< 11661 1726882373.87598: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36c7d6f70> <<< 11661 1726882373.87602: stdout chunk 
(state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py <<< 11661 1726882373.87644: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 11661 1726882373.87648: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36c7d64f0> <<< 11661 1726882373.87651: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36c7d6d90> <<< 11661 1726882373.87680: stdout chunk (state=3): >>>import 'atexit' # <<< 11661 1726882373.87685: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd36c7d6fd0> <<< 11661 1726882373.87708: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 11661 1726882373.87732: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 11661 1726882373.87781: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36c7d6100> <<< 11661 1726882373.87787: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py <<< 11661 1726882373.87815: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 11661 1726882373.87821: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 11661 1726882373.87860: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 11661 1726882373.87865: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py <<< 11661 1726882373.87873: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 11661 1726882373.87944: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36c72df40> <<< 11661 1726882373.87979: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd36c74cd00> <<< 11661 1726882373.88016: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd36c74ceb0> <<< 11661 1726882373.88024: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py <<< 11661 1726882373.88054: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 11661 1726882373.88091: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36c74c370> <<< 11661 1726882373.88102: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cd6edc0> <<< 11661 1726882373.88277: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object 
at 0x7fd36cd6e3a0> <<< 11661 1726882373.88281: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py <<< 11661 1726882373.88290: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 11661 1726882373.88300: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cd6efd0> <<< 11661 1726882373.88331: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py <<< 11661 1726882373.88334: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 11661 1726882373.88386: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 11661 1726882373.88391: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py <<< 11661 1726882373.88396: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 11661 1726882373.88412: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' <<< 11661 1726882373.88423: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cd3ed30> <<< 11661 1726882373.88515: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36c7a9d30> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36c7a9400> <<< 11661 1726882373.88521: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fd36c7de4f0> <<< 11661 1726882373.88546: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' <<< 11661 1726882373.88553: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd36c7a9520> <<< 11661 1726882373.88569: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36c7a9550> <<< 11661 1726882373.88606: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py <<< 11661 1726882373.88609: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' <<< 11661 1726882373.88632: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 11661 1726882373.88653: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 11661 1726882373.88724: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd36c71dfd0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cd4f250> <<< 11661 1726882373.88756: stdout chunk 
(state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py <<< 11661 1726882373.88759: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 11661 1726882373.88820: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd36c71a850> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cd4f3d0> <<< 11661 1726882373.88835: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 11661 1726882373.88877: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 11661 1726882373.88903: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py <<< 11661 1726882373.88906: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # <<< 11661 1726882373.88969: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cd67e50> <<< 11661 1726882373.89095: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36c71a7f0> <<< 11661 1726882373.89195: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader 
object at 0x7fd36c71a640> <<< 11661 1726882373.89220: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' <<< 11661 1726882373.89223: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd36c7195b0> <<< 11661 1726882373.89273: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd36c70ed90> <<< 11661 1726882373.89277: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cd48910> <<< 11661 1726882373.89298: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 11661 1726882373.89322: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py <<< 11661 1726882373.89325: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 11661 1726882373.89371: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fd36c79f6a0> <<< 11661 1726882373.89567: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd36c79db20> <<< 11661 1726882373.89573: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36c7ad0a0> <<< 11661 1726882373.89597: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd36c79f100> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36c7e2b20> <<< 11661 1726882373.89614: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11661 1726882373.89640: stdout chunk (state=3): >>>import ansible.module_utils.compat # loaded from Zip /tmp/ansible_stat_payload_pk4jlht_/ansible_stat_payload.zip/ansible/module_utils/compat/__init__.py <<< 11661 1726882373.89643: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.89720: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.89791: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.89814: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_stat_payload_pk4jlht_/ansible_stat_payload.zip/ansible/module_utils/common/__init__.py <<< 11661 1726882373.89851: stdout chunk (state=3): >>># zipimport: zlib available # 
zipimport: zlib available <<< 11661 1726882373.89855: stdout chunk (state=3): >>>import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_stat_payload_pk4jlht_/ansible_stat_payload.zip/ansible/module_utils/common/text/__init__.py <<< 11661 1726882373.89857: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.89957: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.90047: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.90497: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.90958: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_stat_payload_pk4jlht_/ansible_stat_payload.zip/ansible/module_utils/six/__init__.py <<< 11661 1726882373.90964: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_stat_payload_pk4jlht_/ansible_stat_payload.zip/ansible/module_utils/common/text/converters.py <<< 11661 1726882373.90992: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py <<< 11661 1726882373.90994: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 11661 1726882373.91053: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd36c2f45e0> <<< 11661 1726882373.91125: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from 
'/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' <<< 11661 1726882373.91128: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36c6e7580> <<< 11661 1726882373.91138: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36c295100> <<< 11661 1726882373.91189: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_stat_payload_pk4jlht_/ansible_stat_payload.zip/ansible/module_utils/compat/selinux.py <<< 11661 1726882373.91192: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.91205: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.91230: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded from Zip /tmp/ansible_stat_payload_pk4jlht_/ansible_stat_payload.zip/ansible/module_utils/_text.py <<< 11661 1726882373.91233: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.91347: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.91480: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 11661 1726882373.91500: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36c79db80> <<< 11661 1726882373.91511: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.91894: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.92278: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.92307: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.92382: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip 
/tmp/ansible_stat_payload_pk4jlht_/ansible_stat_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available <<< 11661 1726882373.92410: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.92442: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_stat_payload_pk4jlht_/ansible_stat_payload.zip/ansible/module_utils/common/warnings.py <<< 11661 1726882373.92455: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.92509: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.92579: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_stat_payload_pk4jlht_/ansible_stat_payload.zip/ansible/module_utils/errors.py <<< 11661 1726882373.92604: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_stat_payload_pk4jlht_/ansible_stat_payload.zip/ansible/module_utils/parsing/__init__.py <<< 11661 1726882373.92618: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.92651: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.92685: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_stat_payload_pk4jlht_/ansible_stat_payload.zip/ansible/module_utils/parsing/convert_bool.py <<< 11661 1726882373.92707: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.92875: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.93067: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 11661 1726882373.93095: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' <<< 11661 1726882373.93100: stdout chunk (state=3): >>>import '_ast' # <<< 11661 1726882373.93177: stdout 
chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36c2c6f10> # zipimport: zlib available <<< 11661 1726882373.93240: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.93303: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_stat_payload_pk4jlht_/ansible_stat_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_stat_payload_pk4jlht_/ansible_stat_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_stat_payload_pk4jlht_/ansible_stat_payload.zip/ansible/module_utils/common/parameters.py <<< 11661 1726882373.93322: stdout chunk (state=3): >>>import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_stat_payload_pk4jlht_/ansible_stat_payload.zip/ansible/module_utils/common/arg_spec.py <<< 11661 1726882373.93334: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.93373: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.93405: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_stat_payload_pk4jlht_/ansible_stat_payload.zip/ansible/module_utils/common/locale.py <<< 11661 1726882373.93418: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.93451: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.93490: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.93583: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.93640: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 11661 1726882373.93668: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 11661 1726882373.93739: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd36cd5a220> <<< 11661 1726882373.93774: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36c2c6850> <<< 11661 1726882373.93812: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_stat_payload_pk4jlht_/ansible_stat_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_stat_payload_pk4jlht_/ansible_stat_payload.zip/ansible/module_utils/common/process.py <<< 11661 1726882373.93819: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.93957: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.94005: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.94029: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.94072: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py <<< 11661 1726882373.94087: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' <<< 11661 1726882373.94099: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 11661 1726882373.94137: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 11661 1726882373.94152: stdout 
chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py <<< 11661 1726882373.94172: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 11661 1726882373.94259: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36c6ddca0> <<< 11661 1726882373.94298: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36c6d9f70> <<< 11661 1726882373.94356: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36c6d2940> <<< 11661 1726882373.94378: stdout chunk (state=3): >>># destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_stat_payload_pk4jlht_/ansible_stat_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available <<< 11661 1726882373.94396: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.94411: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_stat_payload_pk4jlht_/ansible_stat_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_stat_payload_pk4jlht_/ansible_stat_payload.zip/ansible/module_utils/common/sys_info.py <<< 11661 1726882373.94474: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_stat_payload_pk4jlht_/ansible_stat_payload.zip/ansible/module_utils/basic.py <<< 11661 1726882373.94502: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11661 1726882373.94516: stdout chunk (state=3): >>>import ansible.modules # loaded from Zip /tmp/ansible_stat_payload_pk4jlht_/ansible_stat_payload.zip/ansible/modules/__init__.py <<< 11661 1726882373.94524: stdout chunk (state=3): >>># zipimport: zlib available <<< 
11661 1726882373.94641: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.94809: stdout chunk (state=3): >>># zipimport: zlib available <<< 11661 1726882373.94948: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ <<< 11661 1726882373.95232: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib <<< 11661 1726882373.95255: stdout chunk (state=3): >>># cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ <<< 11661 1726882373.95282: stdout chunk (state=3): >>># cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # 
cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword <<< 11661 1726882373.95297: stdout chunk (state=3): >>># destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc <<< 11661 1726882373.95311: stdout chunk (state=3): >>># cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # 
destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize <<< 11661 1726882373.95334: stdout chunk (state=3): >>># cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # 
destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy <<< 11661 1726882373.95345: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux 
# cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 11661 1726882373.95527: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery <<< 11661 1726882373.95559: stdout chunk (state=3): >>># destroy zipimport # destroy _compression <<< 11661 1726882373.95586: stdout chunk (state=3): >>># destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma <<< 11661 1726882373.95608: stdout chunk (state=3): >>># destroy __main__ # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings <<< 11661 1726882373.95625: stdout chunk (state=3): >>># destroy syslog # destroy uuid # destroy array <<< 11661 1726882373.95642: stdout chunk (state=3): >>># destroy datetime <<< 11661 1726882373.95672: stdout chunk (state=3): >>># destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging # destroy argparse <<< 11661 1726882373.95692: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian <<< 11661 1726882373.95718: stdout chunk (state=3): >>># cleanup[3] wiping _ctypes # 
cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize <<< 11661 1726882373.95744: stdout chunk (state=3): >>># cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch <<< 11661 1726882373.95774: stdout chunk (state=3): >>># cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap <<< 11661 1726882373.95792: stdout chunk (state=3): >>># cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq <<< 11661 1726882373.95810: stdout chunk (state=3): >>># destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre 
# cleanup[3] wiping types <<< 11661 1726882373.95827: stdout chunk (state=3): >>># cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath <<< 11661 1726882373.95855: stdout chunk (state=3): >>># cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys <<< 11661 1726882373.95878: stdout chunk (state=3): >>># cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 11661 1726882373.96021: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize <<< 11661 1726882373.96043: stdout chunk (state=3): >>># destroy _heapq # destroy posixpath <<< 11661 1726882373.96071: stdout chunk (state=3): >>># destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select<<< 11661 1726882373.96090: stdout chunk (state=3): >>> # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy 
ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal <<< 11661 1726882373.96116: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 11661 1726882373.96534: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 11661 1726882373.96543: stdout chunk (state=3): >>><<< 11661 1726882373.96568: stderr chunk (state=3): >>><<< 11661 1726882373.96680: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36d373dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36d3183a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36d373b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches 
/usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36d373ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36d318490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36d318940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36d318670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36d2cf190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36d2cf220> # 
/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36d2f2850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36d2cf940> import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36d330880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36d2c8d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36d2f2d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36d318970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36d26eeb0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36d270f40> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36d267610> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36d26d640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36d26e370> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches 
/usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd36cfd4e20> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cfd4910> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cfd4f10> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cfd4fd0> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cfe70d0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36d249d90> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36d242670> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fd36d2546d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36d275e20> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd36cfe7cd0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36d2492b0> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd36d2542e0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36d27b9d0> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cfe7eb0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cfe7df0> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches 
/usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cfe7d60> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cfba3d0> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cfba4c0> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cfeef40> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cfe9a90> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cfe9490> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from 
'/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cee3220> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cfa5520> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cfe9f10> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36d27b040> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cef5b50> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd36cef5e80> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cf06790> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' 
import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cf06cd0> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd36ce94400> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cef5f70> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd36cea52e0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cf06610> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd36cea53a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cfe7a30> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from 
'/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd36cec0700> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd36cec09d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cec07c0> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd36cec08b0> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd36cec0d00> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd36cecb250> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cec0940> import 
'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36ceb4a90> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cfe7610> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cec0af0> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fd36cde16d0> # zipimport: found 30 names in '/tmp/ansible_stat_payload_pk4jlht_/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_stat_payload_pk4jlht_/ansible_stat_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_stat_payload_pk4jlht_/ansible_stat_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36c7d6820> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded 
from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd36c7d6160> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36c7d6280> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36c7d6f70> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36c7d64f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36c7d6d90> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd36c7d6fd0> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36c7d6100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fd36c72df40> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd36c74cd00> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd36c74ceb0> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36c74c370> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cd6edc0> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cd6e3a0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cd6efd0> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from 
'/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cd3ed30> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36c7a9d30> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36c7a9400> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36c7de4f0> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd36c7a9520> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36c7a9550> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd36c71dfd0> 
import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cd4f250> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd36c71a850> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cd4f3d0> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cd67e50> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36c71a7f0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd36c71a640> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd36c7195b0> # extension module 
'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd36c70ed90> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36cd48910> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd36c79f6a0> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd36c79db20> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36c7ad0a0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd36c79f100> import 'systemd.daemon' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fd36c7e2b20> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_stat_payload_pk4jlht_/ansible_stat_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_stat_payload_pk4jlht_/ansible_stat_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_stat_payload_pk4jlht_/ansible_stat_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_stat_payload_pk4jlht_/ansible_stat_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_stat_payload_pk4jlht_/ansible_stat_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd36c2f45e0> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from 
'/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36c6e7580> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36c295100> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_stat_payload_pk4jlht_/ansible_stat_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_stat_payload_pk4jlht_/ansible_stat_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36c79db80> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_stat_payload_pk4jlht_/ansible_stat_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_stat_payload_pk4jlht_/ansible_stat_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_stat_payload_pk4jlht_/ansible_stat_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_stat_payload_pk4jlht_/ansible_stat_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip 
/tmp/ansible_stat_payload_pk4jlht_/ansible_stat_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36c2c6f10> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_stat_payload_pk4jlht_/ansible_stat_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_stat_payload_pk4jlht_/ansible_stat_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_stat_payload_pk4jlht_/ansible_stat_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_stat_payload_pk4jlht_/ansible_stat_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_stat_payload_pk4jlht_/ansible_stat_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' 
import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd36cd5a220> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36c2c6850> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_stat_payload_pk4jlht_/ansible_stat_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_stat_payload_pk4jlht_/ansible_stat_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36c6ddca0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36c6d9f70> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd36c6d2940> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_stat_payload_pk4jlht_/ansible_stat_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_stat_payload_pk4jlht_/ansible_stat_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip 
/tmp/ansible_stat_payload_pk4jlht_/ansible_stat_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_stat_payload_pk4jlht_/ansible_stat_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_stat_payload_pk4jlht_/ansible_stat_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # 
cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] 
removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy 
ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # 
destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy array # destroy datetime # destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging # destroy argparse # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # 
cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] 
wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. [WARNING]: Module invocation had junk after the JSON data: 11661 1726882373.97825: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882373.4822066-11815-212166598929980/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11661 1726882373.97828: _low_level_execute_command(): starting 11661 1726882373.97831: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882373.4822066-11815-212166598929980/ > /dev/null 2>&1 && sleep 0' 11661 1726882373.97969: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882373.97973: stderr chunk
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882373.97975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882373.97978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882373.98014: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882373.98018: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882373.98020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882373.98037: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882373.98040: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882373.98042: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882373.98123: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882373.98126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882373.98128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882373.98130: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882373.98132: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882373.98134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882373.98156: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882373.98174: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882373.98184: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882373.98310: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882374.00214: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882374.00234: stderr chunk (state=3): >>><<< 11661 1726882374.00237: stdout chunk (state=3): >>><<< 11661 1726882374.00624: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882374.00627: handler run complete 11661 1726882374.00629: attempt loop complete, returning result 11661 1726882374.00631: _execute() done 11661 1726882374.00633: dumping result to json 11661 1726882374.00635: done dumping result, returning 11661 1726882374.00637: done running TaskExecutor() for managed_node2/TASK: Check if system is ostree [0e448fcc-3ce9-896b-2321-0000000000df] 11661 1726882374.00639: sending task result for task 0e448fcc-3ce9-896b-2321-0000000000df 11661 1726882374.00701: 
done sending task result for task 0e448fcc-3ce9-896b-2321-0000000000df 11661 1726882374.00703: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 11661 1726882374.00771: no more pending results, returning what we have 11661 1726882374.00774: results queue empty 11661 1726882374.00775: checking for any_errors_fatal 11661 1726882374.00783: done checking for any_errors_fatal 11661 1726882374.00784: checking for max_fail_percentage 11661 1726882374.00785: done checking for max_fail_percentage 11661 1726882374.00786: checking to see if all hosts have failed and the running result is not ok 11661 1726882374.00787: done checking to see if all hosts have failed 11661 1726882374.00788: getting the remaining hosts for this loop 11661 1726882374.00789: done getting the remaining hosts for this loop 11661 1726882374.00792: getting the next task for host managed_node2 11661 1726882374.00798: done getting next task for host managed_node2 11661 1726882374.00801: ^ task is: TASK: Set flag to indicate system is ostree 11661 1726882374.00804: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11661 1726882374.00807: getting variables 11661 1726882374.00809: in VariableManager get_vars() 11661 1726882374.00836: Calling all_inventory to load vars for managed_node2 11661 1726882374.00839: Calling groups_inventory to load vars for managed_node2 11661 1726882374.00842: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882374.00854: Calling all_plugins_play to load vars for managed_node2 11661 1726882374.00858: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882374.00861: Calling groups_plugins_play to load vars for managed_node2 11661 1726882374.01034: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882374.01229: done with get_vars() 11661 1726882374.01239: done getting variables 11661 1726882374.01336: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Friday 20 September 2024 21:32:54 -0400 (0:00:00.630) 0:00:02.727 ****** 11661 1726882374.01376: entering _queue_task() for managed_node2/set_fact 11661 1726882374.01378: Creating lock for set_fact 11661 1726882374.01628: worker is 1 (out of 1 available) 11661 1726882374.01639: exiting _queue_task() for managed_node2/set_fact 11661 1726882374.01653: done queuing things up, now waiting for results queue to drain 11661 1726882374.01656: waiting for pending results... 
11661 1726882374.01910: running TaskExecutor() for managed_node2/TASK: Set flag to indicate system is ostree 11661 1726882374.02017: in run() - task 0e448fcc-3ce9-896b-2321-0000000000e0 11661 1726882374.02036: variable 'ansible_search_path' from source: unknown 11661 1726882374.02044: variable 'ansible_search_path' from source: unknown 11661 1726882374.02092: calling self._execute() 11661 1726882374.02169: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882374.02181: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882374.02193: variable 'omit' from source: magic vars 11661 1726882374.02726: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11661 1726882374.02958: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11661 1726882374.03006: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11661 1726882374.03039: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11661 1726882374.03083: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11661 1726882374.03185: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11661 1726882374.03215: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11661 1726882374.03246: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882374.03286: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11661 1726882374.03429: Evaluated conditional (not __network_is_ostree is defined): True 11661 1726882374.03440: variable 'omit' from source: magic vars 11661 1726882374.03490: variable 'omit' from source: magic vars 11661 1726882374.03621: variable '__ostree_booted_stat' from source: set_fact 11661 1726882374.03681: variable 'omit' from source: magic vars 11661 1726882374.03711: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11661 1726882374.03747: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11661 1726882374.03777: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11661 1726882374.03799: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882374.03815: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882374.03857: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11661 1726882374.03869: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882374.03878: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882374.03989: Set connection var ansible_connection to ssh 11661 1726882374.04000: Set connection var ansible_pipelining to False 11661 1726882374.04010: Set connection var ansible_module_compression to ZIP_DEFLATED 11661 1726882374.04023: Set connection var ansible_timeout to 10 11661 1726882374.04030: Set connection var ansible_shell_type to sh 11661 1726882374.04041: Set connection var ansible_shell_executable to /bin/sh 11661 1726882374.04075: variable 'ansible_shell_executable' 
from source: unknown 11661 1726882374.04083: variable 'ansible_connection' from source: unknown 11661 1726882374.04090: variable 'ansible_module_compression' from source: unknown 11661 1726882374.04096: variable 'ansible_shell_type' from source: unknown 11661 1726882374.04103: variable 'ansible_shell_executable' from source: unknown 11661 1726882374.04109: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882374.04117: variable 'ansible_pipelining' from source: unknown 11661 1726882374.04123: variable 'ansible_timeout' from source: unknown 11661 1726882374.04131: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882374.04242: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11661 1726882374.04262: variable 'omit' from source: magic vars 11661 1726882374.04276: starting attempt loop 11661 1726882374.04282: running the handler 11661 1726882374.04301: handler run complete 11661 1726882374.04314: attempt loop complete, returning result 11661 1726882374.04320: _execute() done 11661 1726882374.04326: dumping result to json 11661 1726882374.04332: done dumping result, returning 11661 1726882374.04343: done running TaskExecutor() for managed_node2/TASK: Set flag to indicate system is ostree [0e448fcc-3ce9-896b-2321-0000000000e0] 11661 1726882374.04356: sending task result for task 0e448fcc-3ce9-896b-2321-0000000000e0 11661 1726882374.04465: done sending task result for task 0e448fcc-3ce9-896b-2321-0000000000e0 11661 1726882374.04474: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 11661 1726882374.04542: no more pending results, returning what we have 11661 1726882374.04545: results 
queue empty 11661 1726882374.04546: checking for any_errors_fatal 11661 1726882374.04557: done checking for any_errors_fatal 11661 1726882374.04558: checking for max_fail_percentage 11661 1726882374.04560: done checking for max_fail_percentage 11661 1726882374.04560: checking to see if all hosts have failed and the running result is not ok 11661 1726882374.04561: done checking to see if all hosts have failed 11661 1726882374.04562: getting the remaining hosts for this loop 11661 1726882374.04565: done getting the remaining hosts for this loop 11661 1726882374.04569: getting the next task for host managed_node2 11661 1726882374.04581: done getting next task for host managed_node2 11661 1726882374.04583: ^ task is: TASK: Fix CentOS6 Base repo 11661 1726882374.04586: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11661 1726882374.04591: getting variables 11661 1726882374.04592: in VariableManager get_vars() 11661 1726882374.04623: Calling all_inventory to load vars for managed_node2 11661 1726882374.04626: Calling groups_inventory to load vars for managed_node2 11661 1726882374.04629: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882374.04640: Calling all_plugins_play to load vars for managed_node2 11661 1726882374.04643: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882374.04658: Calling groups_plugins_play to load vars for managed_node2 11661 1726882374.04876: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882374.05071: done with get_vars() 11661 1726882374.05081: done getting variables 11661 1726882374.05402: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Friday 20 September 2024 21:32:54 -0400 (0:00:00.040) 0:00:02.767 ****** 11661 1726882374.05430: entering _queue_task() for managed_node2/copy 11661 1726882374.05690: worker is 1 (out of 1 available) 11661 1726882374.05700: exiting _queue_task() for managed_node2/copy 11661 1726882374.05711: done queuing things up, now waiting for results queue to drain 11661 1726882374.05713: waiting for pending results... 
11661 1726882374.05954: running TaskExecutor() for managed_node2/TASK: Fix CentOS6 Base repo 11661 1726882374.06051: in run() - task 0e448fcc-3ce9-896b-2321-0000000000e2 11661 1726882374.06071: variable 'ansible_search_path' from source: unknown 11661 1726882374.06077: variable 'ansible_search_path' from source: unknown 11661 1726882374.06112: calling self._execute() 11661 1726882374.06190: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882374.06200: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882374.06212: variable 'omit' from source: magic vars 11661 1726882374.06671: variable 'ansible_distribution' from source: facts 11661 1726882374.06701: Evaluated conditional (ansible_distribution == 'CentOS'): True 11661 1726882374.06822: variable 'ansible_distribution_major_version' from source: facts 11661 1726882374.06833: Evaluated conditional (ansible_distribution_major_version == '6'): False 11661 1726882374.06841: when evaluation is False, skipping this task 11661 1726882374.06848: _execute() done 11661 1726882374.06858: dumping result to json 11661 1726882374.06869: done dumping result, returning 11661 1726882374.06879: done running TaskExecutor() for managed_node2/TASK: Fix CentOS6 Base repo [0e448fcc-3ce9-896b-2321-0000000000e2] 11661 1726882374.06888: sending task result for task 0e448fcc-3ce9-896b-2321-0000000000e2 11661 1726882374.07002: done sending task result for task 0e448fcc-3ce9-896b-2321-0000000000e2 11661 1726882374.07010: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 11661 1726882374.07087: no more pending results, returning what we have 11661 1726882374.07091: results queue empty 11661 1726882374.07092: checking for any_errors_fatal 11661 1726882374.07096: done checking for any_errors_fatal 11661 1726882374.07097: checking for 
max_fail_percentage 11661 1726882374.07099: done checking for max_fail_percentage 11661 1726882374.07099: checking to see if all hosts have failed and the running result is not ok 11661 1726882374.07100: done checking to see if all hosts have failed 11661 1726882374.07101: getting the remaining hosts for this loop 11661 1726882374.07102: done getting the remaining hosts for this loop 11661 1726882374.07106: getting the next task for host managed_node2 11661 1726882374.07113: done getting next task for host managed_node2 11661 1726882374.07116: ^ task is: TASK: Include the task 'enable_epel.yml' 11661 1726882374.07119: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11661 1726882374.07122: getting variables 11661 1726882374.07124: in VariableManager get_vars() 11661 1726882374.07154: Calling all_inventory to load vars for managed_node2 11661 1726882374.07157: Calling groups_inventory to load vars for managed_node2 11661 1726882374.07160: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882374.07174: Calling all_plugins_play to load vars for managed_node2 11661 1726882374.07178: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882374.07181: Calling groups_plugins_play to load vars for managed_node2 11661 1726882374.07366: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882374.07591: done with get_vars() 11661 1726882374.07601: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Friday 20 September 2024 21:32:54 -0400 (0:00:00.022) 0:00:02.790 ****** 11661 1726882374.07718: entering _queue_task() for managed_node2/include_tasks 11661 1726882374.08111: worker is 1 (out of 1 available) 11661 1726882374.08123: exiting _queue_task() for managed_node2/include_tasks 11661 1726882374.08134: done queuing things up, now waiting for results queue to drain 11661 1726882374.08135: waiting for pending results... 
11661 1726882374.08372: running TaskExecutor() for managed_node2/TASK: Include the task 'enable_epel.yml' 11661 1726882374.08460: in run() - task 0e448fcc-3ce9-896b-2321-0000000000e3 11661 1726882374.08479: variable 'ansible_search_path' from source: unknown 11661 1726882374.08490: variable 'ansible_search_path' from source: unknown 11661 1726882374.08526: calling self._execute() 11661 1726882374.08601: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882374.08613: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882374.08627: variable 'omit' from source: magic vars 11661 1726882374.09129: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11661 1726882374.11487: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11661 1726882374.11572: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11661 1726882374.11610: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11661 1726882374.11651: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11661 1726882374.11685: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11661 1726882374.11771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11661 1726882374.11803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11661 1726882374.11831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882374.11885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11661 1726882374.11902: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11661 1726882374.12019: variable '__network_is_ostree' from source: set_fact 11661 1726882374.12042: Evaluated conditional (not __network_is_ostree | d(false)): True 11661 1726882374.12055: _execute() done 11661 1726882374.12062: dumping result to json 11661 1726882374.12079: done dumping result, returning 11661 1726882374.12088: done running TaskExecutor() for managed_node2/TASK: Include the task 'enable_epel.yml' [0e448fcc-3ce9-896b-2321-0000000000e3] 11661 1726882374.12097: sending task result for task 0e448fcc-3ce9-896b-2321-0000000000e3 11661 1726882374.12218: no more pending results, returning what we have 11661 1726882374.12224: in VariableManager get_vars() 11661 1726882374.12263: Calling all_inventory to load vars for managed_node2 11661 1726882374.12269: Calling groups_inventory to load vars for managed_node2 11661 1726882374.12278: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882374.12291: Calling all_plugins_play to load vars for managed_node2 11661 1726882374.12295: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882374.12298: Calling groups_plugins_play to load vars for managed_node2 11661 1726882374.12479: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882374.12675: done with get_vars() 11661 1726882374.12683: variable 'ansible_search_path' from source: unknown 11661 
1726882374.12684: variable 'ansible_search_path' from source: unknown 11661 1726882374.12721: we have included files to process 11661 1726882374.12723: generating all_blocks data 11661 1726882374.12725: done generating all_blocks data 11661 1726882374.12733: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 11661 1726882374.12734: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 11661 1726882374.12737: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 11661 1726882374.13310: done sending task result for task 0e448fcc-3ce9-896b-2321-0000000000e3 11661 1726882374.13314: WORKER PROCESS EXITING 11661 1726882374.13946: done processing included file 11661 1726882374.13952: iterating over new_blocks loaded from include file 11661 1726882374.13953: in VariableManager get_vars() 11661 1726882374.13968: done with get_vars() 11661 1726882374.13969: filtering new block on tags 11661 1726882374.13992: done filtering new block on tags 11661 1726882374.13995: in VariableManager get_vars() 11661 1726882374.14005: done with get_vars() 11661 1726882374.14006: filtering new block on tags 11661 1726882374.14017: done filtering new block on tags 11661 1726882374.14019: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node2 11661 1726882374.14025: extending task lists for all hosts with included blocks 11661 1726882374.14126: done extending task lists 11661 1726882374.14128: done processing included files 11661 1726882374.14129: results queue empty 11661 1726882374.14129: checking for any_errors_fatal 11661 1726882374.14132: done checking for any_errors_fatal 11661 1726882374.14133: checking for max_fail_percentage 11661 
1726882374.14134: done checking for max_fail_percentage 11661 1726882374.14135: checking to see if all hosts have failed and the running result is not ok 11661 1726882374.14136: done checking to see if all hosts have failed 11661 1726882374.14136: getting the remaining hosts for this loop 11661 1726882374.14137: done getting the remaining hosts for this loop 11661 1726882374.14140: getting the next task for host managed_node2 11661 1726882374.14143: done getting next task for host managed_node2 11661 1726882374.14145: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 11661 1726882374.14151: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11661 1726882374.14153: getting variables 11661 1726882374.14154: in VariableManager get_vars() 11661 1726882374.14161: Calling all_inventory to load vars for managed_node2 11661 1726882374.14468: Calling groups_inventory to load vars for managed_node2 11661 1726882374.14472: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882374.14477: Calling all_plugins_play to load vars for managed_node2 11661 1726882374.14484: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882374.14487: Calling groups_plugins_play to load vars for managed_node2 11661 1726882374.14604: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882374.14963: done with get_vars() 11661 1726882374.14973: done getting variables 11661 1726882374.15042: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 11661 1726882374.15283: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 9] *********************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Friday 20 September 2024 21:32:54 -0400 (0:00:00.076) 0:00:02.866 ****** 11661 1726882374.15331: entering _queue_task() for managed_node2/command 11661 1726882374.15333: Creating lock for command 11661 1726882374.15641: worker is 1 (out of 1 available) 11661 1726882374.15654: exiting _queue_task() for managed_node2/command 11661 1726882374.15669: done queuing things up, now waiting for results queue to drain 11661 1726882374.15671: waiting for pending results... 
11661 1726882374.15933: running TaskExecutor() for managed_node2/TASK: Create EPEL 9 11661 1726882374.16058: in run() - task 0e448fcc-3ce9-896b-2321-0000000000fd 11661 1726882374.16079: variable 'ansible_search_path' from source: unknown 11661 1726882374.16087: variable 'ansible_search_path' from source: unknown 11661 1726882374.16133: calling self._execute() 11661 1726882374.16214: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882374.16230: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882374.16245: variable 'omit' from source: magic vars 11661 1726882374.16982: variable 'ansible_distribution' from source: facts 11661 1726882374.17000: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 11661 1726882374.17138: variable 'ansible_distribution_major_version' from source: facts 11661 1726882374.17152: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 11661 1726882374.17160: when evaluation is False, skipping this task 11661 1726882374.17170: _execute() done 11661 1726882374.17176: dumping result to json 11661 1726882374.17184: done dumping result, returning 11661 1726882374.17200: done running TaskExecutor() for managed_node2/TASK: Create EPEL 9 [0e448fcc-3ce9-896b-2321-0000000000fd] 11661 1726882374.17211: sending task result for task 0e448fcc-3ce9-896b-2321-0000000000fd skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 11661 1726882374.17379: no more pending results, returning what we have 11661 1726882374.17382: results queue empty 11661 1726882374.17383: checking for any_errors_fatal 11661 1726882374.17385: done checking for any_errors_fatal 11661 1726882374.17385: checking for max_fail_percentage 11661 1726882374.17387: done checking for max_fail_percentage 11661 1726882374.17388: checking to see if all hosts have failed and 
the running result is not ok 11661 1726882374.17389: done checking to see if all hosts have failed 11661 1726882374.17389: getting the remaining hosts for this loop 11661 1726882374.17391: done getting the remaining hosts for this loop 11661 1726882374.17394: getting the next task for host managed_node2 11661 1726882374.17402: done getting next task for host managed_node2 11661 1726882374.17404: ^ task is: TASK: Install yum-utils package 11661 1726882374.17408: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11661 1726882374.17412: getting variables 11661 1726882374.17413: in VariableManager get_vars() 11661 1726882374.17442: Calling all_inventory to load vars for managed_node2 11661 1726882374.17445: Calling groups_inventory to load vars for managed_node2 11661 1726882374.17450: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882374.17464: Calling all_plugins_play to load vars for managed_node2 11661 1726882374.17467: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882374.17471: Calling groups_plugins_play to load vars for managed_node2 11661 1726882374.18135: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882374.18327: done with get_vars() 11661 1726882374.18339: done getting variables 11661 1726882374.18476: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Friday 20 September 2024 21:32:54 -0400 (0:00:00.031) 0:00:02.898 ****** 11661 1726882374.18514: entering _queue_task() for managed_node2/package 11661 1726882374.18516: Creating lock for package 11661 1726882374.18980: done sending task result for task 0e448fcc-3ce9-896b-2321-0000000000fd 11661 1726882374.18988: WORKER PROCESS EXITING 11661 1726882374.19492: worker is 1 (out of 1 available) 11661 1726882374.19507: exiting _queue_task() for managed_node2/package 11661 1726882374.19519: done queuing things up, now waiting for results queue to drain 11661 1726882374.19521: waiting for pending results... 
11661 1726882374.19778: running TaskExecutor() for managed_node2/TASK: Install yum-utils package
11661 1726882374.19896: in run() - task 0e448fcc-3ce9-896b-2321-0000000000fe
11661 1726882374.19912: variable 'ansible_search_path' from source: unknown
11661 1726882374.19919: variable 'ansible_search_path' from source: unknown
11661 1726882374.19961: calling self._execute()
11661 1726882374.20035: variable 'ansible_host' from source: host vars for 'managed_node2'
11661 1726882374.20047: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11661 1726882374.20068: variable 'omit' from source: magic vars
11661 1726882374.20499: variable 'ansible_distribution' from source: facts
11661 1726882374.20523: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True
11661 1726882374.20651: variable 'ansible_distribution_major_version' from source: facts
11661 1726882374.20665: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False
11661 1726882374.20673: when evaluation is False, skipping this task
11661 1726882374.20680: _execute() done
11661 1726882374.20686: dumping result to json
11661 1726882374.20694: done dumping result, returning
11661 1726882374.20704: done running TaskExecutor() for managed_node2/TASK: Install yum-utils package [0e448fcc-3ce9-896b-2321-0000000000fe]
11661 1726882374.20714: sending task result for task 0e448fcc-3ce9-896b-2321-0000000000fe
11661 1726882374.20838: done sending task result for task 0e448fcc-3ce9-896b-2321-0000000000fe
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version in ['7', '8']",
    "skip_reason": "Conditional result was False"
}
11661 1726882374.20894: no more pending results, returning what we have
11661 1726882374.20898: results queue empty
11661 1726882374.20899: checking for any_errors_fatal
11661 1726882374.20908: done checking for any_errors_fatal
11661 1726882374.20909: checking for max_fail_percentage
11661 1726882374.20911: done checking for max_fail_percentage
11661 1726882374.20912: checking to see if all hosts have failed and the running result is not ok
11661 1726882374.20913: done checking to see if all hosts have failed
11661 1726882374.20914: getting the remaining hosts for this loop
11661 1726882374.20916: done getting the remaining hosts for this loop
11661 1726882374.20920: getting the next task for host managed_node2
11661 1726882374.20928: done getting next task for host managed_node2
11661 1726882374.20930: ^ task is: TASK: Enable EPEL 7
11661 1726882374.20935: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11661 1726882374.20938: getting variables
11661 1726882374.20940: in VariableManager get_vars()
11661 1726882374.21028: Calling all_inventory to load vars for managed_node2
11661 1726882374.21032: Calling groups_inventory to load vars for managed_node2
11661 1726882374.21037: Calling all_plugins_inventory to load vars for managed_node2
11661 1726882374.21054: Calling all_plugins_play to load vars for managed_node2
11661 1726882374.21058: Calling groups_plugins_inventory to load vars for managed_node2
11661 1726882374.21062: Calling groups_plugins_play to load vars for managed_node2
11661 1726882374.21233: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11661 1726882374.21444: done with get_vars()
11661 1726882374.21459: done getting variables
11661 1726882374.21717: WORKER PROCESS EXITING
11661 1726882374.21753: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Enable EPEL 7] ***********************************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32
Friday 20 September 2024 21:32:54 -0400 (0:00:00.032) 0:00:02.931 ******
11661 1726882374.21786: entering _queue_task() for managed_node2/command
11661 1726882374.22012: worker is 1 (out of 1 available)
11661 1726882374.22023: exiting _queue_task() for managed_node2/command
11661 1726882374.22035: done queuing things up, now waiting for results queue to drain
11661 1726882374.22037: waiting for pending results...
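[Editor's note: the `Install yum-utils package` skip recorded above is the normal outcome of an ordered `when` list: Ansible evaluates each entry in turn and the first False stops the task, which is why the log shows the distribution check come back True and then the major-version check come back False. The task file itself is not part of this log, so the following is only a hedged sketch of what the task at enable_epel.yml:26 plausibly looks like; the module arguments are assumptions, and only the two logged conditions are taken from the trace.]

```yaml
# Hypothetical reconstruction -- only the two `when` conditions appear in the log.
- name: Install yum-utils package
  package:
    name: yum-utils        # assumed; the log only names the 'package' action
    state: present         # assumed
  when:
    - ansible_distribution in ['RedHat', 'CentOS']        # logged: True
    - ansible_distribution_major_version in ['7', '8']    # logged: False -> skipped
```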
11661 1726882374.22293: running TaskExecutor() for managed_node2/TASK: Enable EPEL 7
11661 1726882374.22403: in run() - task 0e448fcc-3ce9-896b-2321-0000000000ff
11661 1726882374.22421: variable 'ansible_search_path' from source: unknown
11661 1726882374.22428: variable 'ansible_search_path' from source: unknown
11661 1726882374.22472: calling self._execute()
11661 1726882374.22548: variable 'ansible_host' from source: host vars for 'managed_node2'
11661 1726882374.22563: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11661 1726882374.22579: variable 'omit' from source: magic vars
11661 1726882374.22973: variable 'ansible_distribution' from source: facts
11661 1726882374.22991: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True
11661 1726882374.23120: variable 'ansible_distribution_major_version' from source: facts
11661 1726882374.23134: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False
11661 1726882374.23141: when evaluation is False, skipping this task
11661 1726882374.23147: _execute() done
11661 1726882374.23155: dumping result to json
11661 1726882374.23161: done dumping result, returning
11661 1726882374.23171: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 7 [0e448fcc-3ce9-896b-2321-0000000000ff]
11661 1726882374.23179: sending task result for task 0e448fcc-3ce9-896b-2321-0000000000ff
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version in ['7', '8']",
    "skip_reason": "Conditional result was False"
}
11661 1726882374.23314: no more pending results, returning what we have
11661 1726882374.23318: results queue empty
11661 1726882374.23318: checking for any_errors_fatal
11661 1726882374.23325: done checking for any_errors_fatal
11661 1726882374.23325: checking for max_fail_percentage
11661 1726882374.23327: done checking for max_fail_percentage
11661 1726882374.23328: checking to see if all hosts have failed and the running result is not ok
11661 1726882374.23329: done checking to see if all hosts have failed
11661 1726882374.23329: getting the remaining hosts for this loop
11661 1726882374.23331: done getting the remaining hosts for this loop
11661 1726882374.23334: getting the next task for host managed_node2
11661 1726882374.23341: done getting next task for host managed_node2
11661 1726882374.23343: ^ task is: TASK: Enable EPEL 8
11661 1726882374.23346: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11661 1726882374.23352: getting variables
11661 1726882374.23354: in VariableManager get_vars()
11661 1726882374.23382: Calling all_inventory to load vars for managed_node2
11661 1726882374.23385: Calling groups_inventory to load vars for managed_node2
11661 1726882374.23388: Calling all_plugins_inventory to load vars for managed_node2
11661 1726882374.23400: Calling all_plugins_play to load vars for managed_node2
11661 1726882374.23403: Calling groups_plugins_inventory to load vars for managed_node2
11661 1726882374.23406: Calling groups_plugins_play to load vars for managed_node2
11661 1726882374.23605: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11661 1726882374.23832: done with get_vars()
11661 1726882374.23841: done getting variables
11661 1726882374.23913: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Enable EPEL 8] ***********************************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37
Friday 20 September 2024 21:32:54 -0400 (0:00:00.021) 0:00:02.952 ******
11661 1726882374.23946: entering _queue_task() for managed_node2/command
11661 1726882374.23968: done sending task result for task 0e448fcc-3ce9-896b-2321-0000000000ff
11661 1726882374.23978: WORKER PROCESS EXITING
11661 1726882374.24351: worker is 1 (out of 1 available)
11661 1726882374.24362: exiting _queue_task() for managed_node2/command
11661 1726882374.24376: done queuing things up, now waiting for results queue to drain
11661 1726882374.24377: waiting for pending results...
11661 1726882374.24614: running TaskExecutor() for managed_node2/TASK: Enable EPEL 8
11661 1726882374.24721: in run() - task 0e448fcc-3ce9-896b-2321-000000000100
11661 1726882374.24740: variable 'ansible_search_path' from source: unknown
11661 1726882374.24752: variable 'ansible_search_path' from source: unknown
11661 1726882374.24791: calling self._execute()
11661 1726882374.25538: variable 'ansible_host' from source: host vars for 'managed_node2'
11661 1726882374.25551: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11661 1726882374.25568: variable 'omit' from source: magic vars
11661 1726882374.26343: variable 'ansible_distribution' from source: facts
11661 1726882374.26482: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True
11661 1726882374.26728: variable 'ansible_distribution_major_version' from source: facts
11661 1726882374.26795: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False
11661 1726882374.26803: when evaluation is False, skipping this task
11661 1726882374.26811: _execute() done
11661 1726882374.26817: dumping result to json
11661 1726882374.26825: done dumping result, returning
11661 1726882374.26833: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 8 [0e448fcc-3ce9-896b-2321-000000000100]
11661 1726882374.26843: sending task result for task 0e448fcc-3ce9-896b-2321-000000000100
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version in ['7', '8']",
    "skip_reason": "Conditional result was False"
}
11661 1726882374.26992: no more pending results, returning what we have
11661 1726882374.26997: results queue empty
11661 1726882374.26998: checking for any_errors_fatal
11661 1726882374.27004: done checking for any_errors_fatal
11661 1726882374.27004: checking for max_fail_percentage
11661 1726882374.27006: done checking for max_fail_percentage
11661 1726882374.27007: checking to see if all hosts have failed and the running result is not ok
11661 1726882374.27008: done checking to see if all hosts have failed
11661 1726882374.27009: getting the remaining hosts for this loop
11661 1726882374.27011: done getting the remaining hosts for this loop
11661 1726882374.27014: getting the next task for host managed_node2
11661 1726882374.27025: done getting next task for host managed_node2
11661 1726882374.27028: ^ task is: TASK: Enable EPEL 6
11661 1726882374.27032: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11661 1726882374.27037: getting variables
11661 1726882374.27038: in VariableManager get_vars()
11661 1726882374.27120: Calling all_inventory to load vars for managed_node2
11661 1726882374.27123: Calling groups_inventory to load vars for managed_node2
11661 1726882374.27127: Calling all_plugins_inventory to load vars for managed_node2
11661 1726882374.27140: Calling all_plugins_play to load vars for managed_node2
11661 1726882374.27144: Calling groups_plugins_inventory to load vars for managed_node2
11661 1726882374.27147: Calling groups_plugins_play to load vars for managed_node2
11661 1726882374.27320: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11661 1726882374.27523: done with get_vars()
11661 1726882374.27532: done getting variables
11661 1726882374.28113: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Enable EPEL 6] ***********************************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42
Friday 20 September 2024 21:32:54 -0400 (0:00:00.041) 0:00:02.994 ******
11661 1726882374.28148: entering _queue_task() for managed_node2/copy
11661 1726882374.28171: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000100
11661 1726882374.28180: WORKER PROCESS EXITING
11661 1726882374.28583: worker is 1 (out of 1 available)
11661 1726882374.28594: exiting _queue_task() for managed_node2/copy
11661 1726882374.28604: done queuing things up, now waiting for results queue to drain
11661 1726882374.28605: waiting for pending results...
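[Editor's note: the `Enable EPEL 6` task queued above is dispatched through the `copy` action plugin rather than `command`, so on EL 6 it would write a file instead of running a tool. The repository file is not in this log; the sketch below is a hypothetical version-gated copy task, with every path and the file body assumed.]

```yaml
# Hypothetical sketch -- the real enable_epel.yml:42 contents are not in the log.
- name: Enable EPEL 6
  copy:
    dest: /etc/yum.repos.d/epel.repo   # assumed destination
    content: |                          # assumed repo definition
      [epel]
      name=Extra Packages for Enterprise Linux 6
      enabled=1
  when: ansible_distribution_major_version == '6'
```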
11661 1726882374.29060: running TaskExecutor() for managed_node2/TASK: Enable EPEL 6
11661 1726882374.29153: in run() - task 0e448fcc-3ce9-896b-2321-000000000102
11661 1726882374.29172: variable 'ansible_search_path' from source: unknown
11661 1726882374.29182: variable 'ansible_search_path' from source: unknown
11661 1726882374.29218: calling self._execute()
11661 1726882374.29296: variable 'ansible_host' from source: host vars for 'managed_node2'
11661 1726882374.29308: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11661 1726882374.29323: variable 'omit' from source: magic vars
11661 1726882374.29696: variable 'ansible_distribution' from source: facts
11661 1726882374.29713: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True
11661 1726882374.29835: variable 'ansible_distribution_major_version' from source: facts
11661 1726882374.29846: Evaluated conditional (ansible_distribution_major_version == '6'): False
11661 1726882374.29857: when evaluation is False, skipping this task
11661 1726882374.29867: _execute() done
11661 1726882374.29874: dumping result to json
11661 1726882374.29882: done dumping result, returning
11661 1726882374.29892: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 6 [0e448fcc-3ce9-896b-2321-000000000102]
11661 1726882374.29903: sending task result for task 0e448fcc-3ce9-896b-2321-000000000102
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '6'",
    "skip_reason": "Conditional result was False"
}
11661 1726882374.30043: no more pending results, returning what we have
11661 1726882374.30047: results queue empty
11661 1726882374.30050: checking for any_errors_fatal
11661 1726882374.30056: done checking for any_errors_fatal
11661 1726882374.30057: checking for max_fail_percentage
11661 1726882374.30058: done checking for max_fail_percentage
11661 1726882374.30060: checking to see if all hosts have failed and the running result is not ok
11661 1726882374.30060: done checking to see if all hosts have failed
11661 1726882374.30061: getting the remaining hosts for this loop
11661 1726882374.30064: done getting the remaining hosts for this loop
11661 1726882374.30068: getting the next task for host managed_node2
11661 1726882374.30078: done getting next task for host managed_node2
11661 1726882374.30081: ^ task is: TASK: Set network provider to 'nm'
11661 1726882374.30083: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11661 1726882374.30087: getting variables
11661 1726882374.30089: in VariableManager get_vars()
11661 1726882374.30117: Calling all_inventory to load vars for managed_node2
11661 1726882374.30119: Calling groups_inventory to load vars for managed_node2
11661 1726882374.30123: Calling all_plugins_inventory to load vars for managed_node2
11661 1726882374.30135: Calling all_plugins_play to load vars for managed_node2
11661 1726882374.30138: Calling groups_plugins_inventory to load vars for managed_node2
11661 1726882374.30142: Calling groups_plugins_play to load vars for managed_node2
11661 1726882374.30324: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11661 1726882374.30519: done with get_vars()
11661 1726882374.30529: done getting variables
11661 1726882374.30603: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Set network provider to 'nm'] ********************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_nm.yml:13
Friday 20 September 2024 21:32:54 -0400 (0:00:00.024) 0:00:03.019 ******
11661 1726882374.30634: entering _queue_task() for managed_node2/set_fact
11661 1726882374.30654: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000102
11661 1726882374.30664: WORKER PROCESS EXITING
11661 1726882374.31007: worker is 1 (out of 1 available)
11661 1726882374.31019: exiting _queue_task() for managed_node2/set_fact
11661 1726882374.31030: done queuing things up, now waiting for results queue to drain
11661 1726882374.31031: waiting for pending results...
11661 1726882374.31256: running TaskExecutor() for managed_node2/TASK: Set network provider to 'nm'
11661 1726882374.31337: in run() - task 0e448fcc-3ce9-896b-2321-000000000007
11661 1726882374.31356: variable 'ansible_search_path' from source: unknown
11661 1726882374.31397: calling self._execute()
11661 1726882374.31522: variable 'ansible_host' from source: host vars for 'managed_node2'
11661 1726882374.31534: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11661 1726882374.31548: variable 'omit' from source: magic vars
11661 1726882374.31665: variable 'omit' from source: magic vars
11661 1726882374.31703: variable 'omit' from source: magic vars
11661 1726882374.31742: variable 'omit' from source: magic vars
11661 1726882374.31792: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
11661 1726882374.31836: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
11661 1726882374.31868: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
11661 1726882374.31890: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11661 1726882374.31909: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11661 1726882374.31945: variable 'inventory_hostname' from source: host vars for 'managed_node2'
11661 1726882374.31957: variable 'ansible_host' from source: host vars for 'managed_node2'
11661 1726882374.31967: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11661 1726882374.32075: Set connection var ansible_connection to ssh
11661 1726882374.32087: Set connection var ansible_pipelining to False
11661 1726882374.32098: Set connection var ansible_module_compression to ZIP_DEFLATED
11661 1726882374.32110: Set connection var ansible_timeout to 10
11661 1726882374.32117: Set connection var ansible_shell_type to sh
11661 1726882374.32132: Set connection var ansible_shell_executable to /bin/sh
11661 1726882374.32160: variable 'ansible_shell_executable' from source: unknown
11661 1726882374.32170: variable 'ansible_connection' from source: unknown
11661 1726882374.32178: variable 'ansible_module_compression' from source: unknown
11661 1726882374.32185: variable 'ansible_shell_type' from source: unknown
11661 1726882374.32192: variable 'ansible_shell_executable' from source: unknown
11661 1726882374.32200: variable 'ansible_host' from source: host vars for 'managed_node2'
11661 1726882374.32207: variable 'ansible_pipelining' from source: unknown
11661 1726882374.32213: variable 'ansible_timeout' from source: unknown
11661 1726882374.32220: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11661 1726882374.32371: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
11661 1726882374.32388: variable 'omit' from source: magic vars
11661 1726882374.32397: starting attempt loop
11661 1726882374.32404: running the handler
11661 1726882374.32420: handler run complete
11661 1726882374.32434: attempt loop complete, returning result
11661 1726882374.32441: _execute() done
11661 1726882374.32446: dumping result to json
11661 1726882374.32460: done dumping result, returning
11661 1726882374.32474: done running TaskExecutor() for managed_node2/TASK: Set network provider to 'nm' [0e448fcc-3ce9-896b-2321-000000000007]
11661 1726882374.32483: sending task result for task 0e448fcc-3ce9-896b-2321-000000000007
ok: [managed_node2] => {
    "ansible_facts": {
        "network_provider": "nm"
    },
    "changed": false
}
11661 1726882374.32672: no more pending results, returning what we have
11661 1726882374.32675: results queue empty
11661 1726882374.32676: checking for any_errors_fatal
11661 1726882374.32682: done checking for any_errors_fatal
11661 1726882374.32682: checking for max_fail_percentage
11661 1726882374.32684: done checking for max_fail_percentage
11661 1726882374.32685: checking to see if all hosts have failed and the running result is not ok
11661 1726882374.32686: done checking to see if all hosts have failed
11661 1726882374.32686: getting the remaining hosts for this loop
11661 1726882374.32688: done getting the remaining hosts for this loop
11661 1726882374.32691: getting the next task for host managed_node2
11661 1726882374.32699: done getting next task for host managed_node2
11661 1726882374.32701: ^ task is: TASK: meta (flush_handlers)
11661 1726882374.32703: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11661 1726882374.32706: getting variables
11661 1726882374.32708: in VariableManager get_vars()
11661 1726882374.32731: Calling all_inventory to load vars for managed_node2
11661 1726882374.32734: Calling groups_inventory to load vars for managed_node2
11661 1726882374.32737: Calling all_plugins_inventory to load vars for managed_node2
11661 1726882374.32747: Calling all_plugins_play to load vars for managed_node2
11661 1726882374.32753: Calling groups_plugins_inventory to load vars for managed_node2
11661 1726882374.32757: Calling groups_plugins_play to load vars for managed_node2
11661 1726882374.32914: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11661 1726882374.33103: done with get_vars()
11661 1726882374.33112: done getting variables
11661 1726882374.33179: in VariableManager get_vars()
11661 1726882374.33187: Calling all_inventory to load vars for managed_node2
11661 1726882374.33190: Calling groups_inventory to load vars for managed_node2
11661 1726882374.33192: Calling all_plugins_inventory to load vars for managed_node2
11661 1726882374.33197: Calling all_plugins_play to load vars for managed_node2
11661 1726882374.33199: Calling groups_plugins_inventory to load vars for managed_node2
11661 1726882374.33202: Calling groups_plugins_play to load vars for managed_node2
11661 1726882374.33507: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000007
11661 1726882374.33510: WORKER PROCESS EXITING
11661 1726882374.33528: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11661 1726882374.33713: done with get_vars()
11661 1726882374.33726: done queuing things up, now waiting for results queue to drain
11661 1726882374.33728: results queue empty
11661 1726882374.33729: checking for any_errors_fatal
11661 1726882374.33731: done checking for any_errors_fatal
11661 1726882374.33732: checking for max_fail_percentage
11661 1726882374.33733: done checking for max_fail_percentage
11661 1726882374.33734: checking to see if all hosts have failed and the running result is not ok
11661 1726882374.33734: done checking to see if all hosts have failed
11661 1726882374.33735: getting the remaining hosts for this loop
11661 1726882374.33736: done getting the remaining hosts for this loop
11661 1726882374.33739: getting the next task for host managed_node2
11661 1726882374.33742: done getting next task for host managed_node2
11661 1726882374.33744: ^ task is: TASK: meta (flush_handlers)
11661 1726882374.33745: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11661 1726882374.33756: getting variables
11661 1726882374.33757: in VariableManager get_vars()
11661 1726882374.33767: Calling all_inventory to load vars for managed_node2
11661 1726882374.33769: Calling groups_inventory to load vars for managed_node2
11661 1726882374.33771: Calling all_plugins_inventory to load vars for managed_node2
11661 1726882374.33775: Calling all_plugins_play to load vars for managed_node2
11661 1726882374.33777: Calling groups_plugins_inventory to load vars for managed_node2
11661 1726882374.33780: Calling groups_plugins_play to load vars for managed_node2
11661 1726882374.33930: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11661 1726882374.34116: done with get_vars()
11661 1726882374.34122: done getting variables
11661 1726882374.34164: in VariableManager get_vars()
11661 1726882374.34172: Calling all_inventory to load vars for managed_node2
11661 1726882374.34174: Calling groups_inventory to load vars for managed_node2
11661 1726882374.34176: Calling all_plugins_inventory to load vars for managed_node2
11661 1726882374.34180: Calling all_plugins_play to load vars for managed_node2
11661 1726882374.34182: Calling groups_plugins_inventory to load vars for managed_node2
11661 1726882374.34184: Calling groups_plugins_play to load vars for managed_node2
11661 1726882374.34322: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11661 1726882374.34523: done with get_vars()
11661 1726882374.34534: done queuing things up, now waiting for results queue to drain
11661 1726882374.34535: results queue empty
11661 1726882374.34536: checking for any_errors_fatal
11661 1726882374.34537: done checking for any_errors_fatal
11661 1726882374.34538: checking for max_fail_percentage
11661 1726882374.34539: done checking for max_fail_percentage
11661 1726882374.34539: checking to see if all hosts have failed and the running result is not ok
11661 1726882374.34540: done checking to see if all hosts have failed
11661 1726882374.34541: getting the remaining hosts for this loop
11661 1726882374.34542: done getting the remaining hosts for this loop
11661 1726882374.34544: getting the next task for host managed_node2
11661 1726882374.34546: done getting next task for host managed_node2
11661 1726882374.34547: ^ task is: None
11661 1726882374.34551: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11661 1726882374.34552: done queuing things up, now waiting for results queue to drain
11661 1726882374.34553: results queue empty
11661 1726882374.34554: checking for any_errors_fatal
11661 1726882374.34554: done checking for any_errors_fatal
11661 1726882374.34555: checking for max_fail_percentage
11661 1726882374.34556: done checking for max_fail_percentage
11661 1726882374.34557: checking to see if all hosts have failed and the running result is not ok
11661 1726882374.34557: done checking to see if all hosts have failed
11661 1726882374.34559: getting the next task for host managed_node2
11661 1726882374.34561: done getting next task for host managed_node2
11661 1726882374.34562: ^ task is: None
11661 1726882374.34565: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11661 1726882374.34610: in VariableManager get_vars()
11661 1726882374.34631: done with get_vars()
11661 1726882374.34636: in VariableManager get_vars()
11661 1726882374.34651: done with get_vars()
11661 1726882374.34656: variable 'omit' from source: magic vars
11661 1726882374.34686: in VariableManager get_vars()
11661 1726882374.34702: done with get_vars()
11661 1726882374.34723: variable 'omit' from source: magic vars

PLAY [Play for testing bond connection] ****************************************
11661 1726882374.35583: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False)
11661 1726882374.35610: getting the remaining hosts for this loop
11661 1726882374.35611: done getting the remaining hosts for this loop
11661 1726882374.35614: getting the next task for host managed_node2
11661 1726882374.35617: done getting next task for host managed_node2
11661 1726882374.35619: ^ task is: TASK: Gathering Facts
11661 1726882374.35620: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11661 1726882374.35622: getting variables
11661 1726882374.35623: in VariableManager get_vars()
11661 1726882374.35636: Calling all_inventory to load vars for managed_node2
11661 1726882374.35638: Calling groups_inventory to load vars for managed_node2
11661 1726882374.35640: Calling all_plugins_inventory to load vars for managed_node2
11661 1726882374.35645: Calling all_plugins_play to load vars for managed_node2
11661 1726882374.35661: Calling groups_plugins_inventory to load vars for managed_node2
11661 1726882374.35667: Calling groups_plugins_play to load vars for managed_node2
11661 1726882374.35800: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11661 1726882374.36011: done with get_vars()
11661 1726882374.36019: done getting variables
11661 1726882374.36062: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:3
Friday 20 September 2024 21:32:54 -0400 (0:00:00.054) 0:00:03.074 ******
11661 1726882374.36109: entering _queue_task() for managed_node2/gather_facts
11661 1726882374.36387: worker is 1 (out of 1 available)
11661 1726882374.36399: exiting _queue_task() for managed_node2/gather_facts
11661 1726882374.36410: done queuing things up, now waiting for results queue to drain
11661 1726882374.36411: waiting for pending results...
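[Editor's note: earlier in this trace the `Set network provider to 'nm'` task returned `ok` with `"network_provider": "nm"` under `ansible_facts` and `"changed": false`, which is exactly how a `set_fact` result renders. That result is consistent with a one-assignment task at tests_bond_nm.yml:13 of roughly the following shape; this is an inferred sketch, not the verified file contents.]

```yaml
# Sketch inferred from the ok: result logged above; set_fact always reports changed=false.
- name: Set network provider to 'nm'
  set_fact:
    network_provider: nm
```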
11661 1726882374.36953: running TaskExecutor() for managed_node2/TASK: Gathering Facts 11661 1726882374.36980: in run() - task 0e448fcc-3ce9-896b-2321-000000000128 11661 1726882374.37001: variable 'ansible_search_path' from source: unknown 11661 1726882374.37048: calling self._execute() 11661 1726882374.37142: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882374.37158: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882374.37176: variable 'omit' from source: magic vars 11661 1726882374.37571: variable 'ansible_distribution_major_version' from source: facts 11661 1726882374.37589: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882374.37605: variable 'omit' from source: magic vars 11661 1726882374.37632: variable 'omit' from source: magic vars 11661 1726882374.37676: variable 'omit' from source: magic vars 11661 1726882374.37727: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11661 1726882374.37770: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11661 1726882374.37799: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11661 1726882374.37827: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882374.37843: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882374.37878: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11661 1726882374.37888: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882374.37896: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882374.38003: Set connection var ansible_connection to ssh 11661 1726882374.38015: Set 
connection var ansible_pipelining to False 11661 1726882374.38025: Set connection var ansible_module_compression to ZIP_DEFLATED 11661 1726882374.38043: Set connection var ansible_timeout to 10 11661 1726882374.38050: Set connection var ansible_shell_type to sh 11661 1726882374.38065: Set connection var ansible_shell_executable to /bin/sh 11661 1726882374.38090: variable 'ansible_shell_executable' from source: unknown 11661 1726882374.38098: variable 'ansible_connection' from source: unknown 11661 1726882374.38106: variable 'ansible_module_compression' from source: unknown 11661 1726882374.38113: variable 'ansible_shell_type' from source: unknown 11661 1726882374.38120: variable 'ansible_shell_executable' from source: unknown 11661 1726882374.38126: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882374.38133: variable 'ansible_pipelining' from source: unknown 11661 1726882374.38145: variable 'ansible_timeout' from source: unknown 11661 1726882374.38152: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882374.38333: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11661 1726882374.38351: variable 'omit' from source: magic vars 11661 1726882374.38369: starting attempt loop 11661 1726882374.38377: running the handler 11661 1726882374.38398: variable 'ansible_facts' from source: unknown 11661 1726882374.38421: _low_level_execute_command(): starting 11661 1726882374.38435: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11661 1726882374.39207: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882374.39223: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 11661 1726882374.39244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882374.39268: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882374.39312: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882374.39325: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882374.39345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882374.39366: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882374.39377: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882374.39387: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882374.39397: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882374.39408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882374.39420: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882374.39429: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882374.39437: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882374.39452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882374.39527: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882374.39548: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882374.39568: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882374.39707: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
11661 1726882374.41387: stdout chunk (state=3): >>>/root <<< 11661 1726882374.41493: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882374.41575: stderr chunk (state=3): >>><<< 11661 1726882374.41579: stdout chunk (state=3): >>><<< 11661 1726882374.41688: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882374.41691: _low_level_execute_command(): starting 11661 1726882374.41694: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882374.4159925-11856-6220956690331 `" && echo ansible-tmp-1726882374.4159925-11856-6220956690331="` echo /root/.ansible/tmp/ansible-tmp-1726882374.4159925-11856-6220956690331 `" ) && sleep 0' 11661 1726882374.42672: stderr chunk (state=2): >>>OpenSSH_8.7p1, 
OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882374.42690: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882374.42715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882374.42754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882374.42801: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882374.42818: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882374.42832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882374.42857: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882374.42871: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882374.42883: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882374.42895: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882374.42909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882374.42932: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882374.42945: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882374.42967: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882374.42983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882374.43071: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882374.43094: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882374.43110: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: 
master version 4 <<< 11661 1726882374.43251: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882374.45279: stdout chunk (state=3): >>>ansible-tmp-1726882374.4159925-11856-6220956690331=/root/.ansible/tmp/ansible-tmp-1726882374.4159925-11856-6220956690331 <<< 11661 1726882374.45380: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882374.45383: stdout chunk (state=3): >>><<< 11661 1726882374.45392: stderr chunk (state=3): >>><<< 11661 1726882374.45413: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882374.4159925-11856-6220956690331=/root/.ansible/tmp/ansible-tmp-1726882374.4159925-11856-6220956690331 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882374.45443: variable 'ansible_module_compression' from source: unknown 11661 1726882374.45809: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-11661k_nrvl51/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 11661 1726882374.45813: variable 'ansible_facts' from source: unknown 11661 1726882374.45815: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882374.4159925-11856-6220956690331/AnsiballZ_setup.py 11661 1726882374.46070: Sending initial data 11661 1726882374.46078: Sent initial data (152 bytes) 11661 1726882374.47061: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882374.47080: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882374.47098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882374.47124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882374.47171: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882374.47185: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882374.47200: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882374.47230: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882374.47244: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882374.47260: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882374.47277: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882374.47291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882374.47307: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882374.47323: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 
1726882374.47341: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882374.47360: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882374.47437: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882374.47472: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882374.47489: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882374.47624: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882374.49486: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11661 1726882374.49585: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 11661 1726882374.49690: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11661k_nrvl51/tmpb016mugi /root/.ansible/tmp/ansible-tmp-1726882374.4159925-11856-6220956690331/AnsiballZ_setup.py <<< 11661 1726882374.49787: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11661 1726882374.52416: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882374.52683: stderr chunk (state=3): >>><<< 11661 1726882374.52686: stdout chunk (state=3): >>><<< 11661 1726882374.52689: done transferring module 
to remote 11661 1726882374.52695: _low_level_execute_command(): starting 11661 1726882374.52697: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882374.4159925-11856-6220956690331/ /root/.ansible/tmp/ansible-tmp-1726882374.4159925-11856-6220956690331/AnsiballZ_setup.py && sleep 0' 11661 1726882374.53293: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882374.53307: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882374.53320: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882374.53335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882374.53383: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882374.53394: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882374.53408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882374.53427: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882374.53439: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882374.53451: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882374.53472: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882374.53487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882374.53503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882374.53515: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882374.53526: stderr chunk (state=3): >>>debug2: match found <<< 11661 
1726882374.53539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882374.53622: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882374.53645: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882374.53662: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882374.53795: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882374.55578: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882374.55647: stderr chunk (state=3): >>><<< 11661 1726882374.55651: stdout chunk (state=3): >>><<< 11661 1726882374.55755: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882374.55759: 
_low_level_execute_command(): starting 11661 1726882374.55762: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882374.4159925-11856-6220956690331/AnsiballZ_setup.py && sleep 0' 11661 1726882374.56441: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882374.56459: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882374.56482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882374.56506: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882374.56558: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882374.56573: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882374.56588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882374.56614: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882374.56642: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882374.56675: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882374.56755: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882374.56995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882374.56998: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882374.57000: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 <<< 11661 1726882374.57118: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882375.08359: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALEARW5ZJ51XTLSDuUsPojumVU0f1DmiQsXjMOap4QLlljOiysapjSUe6pZOyAdiI/KfARhDoOFvlC07kCLCcs7DDk8JxBZpsM0D55SdDlfwsB3FVgWNP+9by8G6kzbePHWdZyyWlAuavj4OAEwAjpWpP8/daus0ha4xywlVVoKjAAAAFQCbiW4bR+tgMvjrxC198dqI1mTbjQAAAIBzCzkJTtnGDKfOHq2dFI5cUEuaj1PgRot3wyaXENzUjZVnIFgXUmgKDCxO+EAtU6uAkBPQF4XNgiuaw5bavYpZxcJ4WIpM4ZDRoSkc7BBbJPRLZ45GfrHJwgqAmAZ3RSvVqeXE4WKQHLm43/eDHewgPqqqWe6QVuQH5SEe79yk3wAAAIEArG+AuupiAeoVJ9Lh36QMj4kRo5pTASh2eD5MqSOdy39UhsXbWBcj3JCIvNk/nwep/9neGyRZ5t5wT05dRX80vlgZJX65hrbepO+lqC3wlng+6GQ34D7TJKYnvEkR3neE0+06kx5R6IRWZf1YQV6fMQhx8AJ2JmvnLFicmYlkhQQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDND+RJCrYgIUzolo5fZ64Ey6cksefKDUWmGDjsqVTmuT3HrlDyUZOro4JAnUQBmiamXsJUFbrFdJAVpukD4yyowqCQLr0ZFuKNEzrt5CObrtWflOskKynO3kaoU0WhDkqIbwS2j/+NxBCxgDGqd/5Os3cOMv3eyjUElz6xoI4zsmGMfxVYmT+/SHBfoyxyqY8Hw2Ooq+H5L9OlYgV4hqu7kKPpM1THUJTjy47m6qvws5gztclLjPA1KIW2Dz6kKzUYspNJcoS2sK1xFvL7mBjpGAP7WhXVH2n5ySenQ24Z6mEj+tG2f11rjPpjCUjDzzciGCWiRDZWBLm/GGmQXJJ8zAYnw82yIUKqufLrr1wmcXICPMVj9pFjXSoBWe/yhX9E87w7YD5HWsUrgrLdSctdV4QYy+R5g9ERi7FjwbRsuZ04BihZs70+f/29hUzuc6MA87KVovGT0Uc7GVC7bx8NLt0bTBsbydlONVHVQuol/YEpQrQophDvmBfh+PgMDH8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOEITn1vyppR+Moe1UdR0WGPhUnQ/dwHNcNi0OYy21LkBQ5jsxOPLvZ+C2MbRYlz2afs4nYYIV8E0AuK6aRks3w=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKEdFOHVk9tX1R+zEyLVdxS/U5QeeeFYWSnUmjpXlpt7", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", 
"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective<<< 11661 1726882375.08428: stdout chunk (state=3): >>>_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_local": {}, "ansible_is_chroot": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_iscsi_iqn": "", "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-158", "ansible_nodename": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "21e18164a0c64d0daed004bd8a1b67b7", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "32", "second": "54", "epoch": "1726882374", "epoch_int": "1726882374", "date": "2024-09-20", "time": "21:32:54", "iso8601_micro": "2024-09-21T01:32:54.823246Z", "iso8601": "2024-09-21T01:32:54Z", "iso8601_basic": "20240920T213254823246", 
"iso8601_basic_short": "20240920T213254", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_pkg_mgr": "dnf", "ansible_fips": false, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 33528 10.31.11.158 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 33528 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_loadavg": {"1m": 0.37, "5m": 0.32, "15m": 0.15}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", 
"ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_lsb": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(<<< 11661 1726882375.08433: stdout chunk (state=3): >>>R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2804, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 728, "free": 2804}, "nocache": {"free": 3260, "used": 272}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2e6858-9a88-b36a-7765-70992ab591a7", "ansible_product_uuid": "ec2e6858-9a88-b36a-7765-70992ab591a7", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], 
"uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 313, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264239464448, "block_size": 4096, "block_total": 65519355, "block_available": 64511588, "block_used": 1007767, "inode_total": 131071472, "inode_available": 130998721, "inode_used": 72751, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", 
"tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:4f:68:7a:de:b1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.158", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::104f:68ff:fe7a:deb1", "prefix": 
"64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", 
"rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.158", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:4f:68:7a:de:b1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.158"], "ansible_all_ipv6_addresses": ["fe80::104f:68ff:fe7a:deb1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.158", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::104f:68ff:fe7a:deb1"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 11661 1726882375.10086: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 11661 1726882375.10089: stdout chunk (state=3): >>><<< 11661 1726882375.10091: stderr chunk (state=3): >>><<< 11661 1726882375.10554: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALEARW5ZJ51XTLSDuUsPojumVU0f1DmiQsXjMOap4QLlljOiysapjSUe6pZOyAdiI/KfARhDoOFvlC07kCLCcs7DDk8JxBZpsM0D55SdDlfwsB3FVgWNP+9by8G6kzbePHWdZyyWlAuavj4OAEwAjpWpP8/daus0ha4xywlVVoKjAAAAFQCbiW4bR+tgMvjrxC198dqI1mTbjQAAAIBzCzkJTtnGDKfOHq2dFI5cUEuaj1PgRot3wyaXENzUjZVnIFgXUmgKDCxO+EAtU6uAkBPQF4XNgiuaw5bavYpZxcJ4WIpM4ZDRoSkc7BBbJPRLZ45GfrHJwgqAmAZ3RSvVqeXE4WKQHLm43/eDHewgPqqqWe6QVuQH5SEe79yk3wAAAIEArG+AuupiAeoVJ9Lh36QMj4kRo5pTASh2eD5MqSOdy39UhsXbWBcj3JCIvNk/nwep/9neGyRZ5t5wT05dRX80vlgZJX65hrbepO+lqC3wlng+6GQ34D7TJKYnvEkR3neE0+06kx5R6IRWZf1YQV6fMQhx8AJ2JmvnLFicmYlkhQQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDND+RJCrYgIUzolo5fZ64Ey6cksefKDUWmGDjsqVTmuT3HrlDyUZOro4JAnUQBmiamXsJUFbrFdJAVpukD4yyowqCQLr0ZFuKNEzrt5CObrtWflOskKynO3kaoU0WhDkqIbwS2j/+NxBCxgDGqd/5Os3cOMv3eyjUElz6xoI4zsmGMfxVYmT+/SHBfoyxyqY8Hw2Ooq+H5L9OlYgV4hqu7kKPpM1THUJTjy47m6qvws5gztclLjPA1KIW2Dz6kKzUYspNJcoS2sK1xFvL7mBjpGAP7WhXVH2n5ySenQ24Z6mEj+tG2f11rjPpjCUjDzzciGCWiRDZWBLm/GGmQXJJ8zAYnw82yIUKqufLrr1wmcXICPMVj9pFjXSoBWe/yhX9E87w7YD5HWsUrgrLdSctdV4QYy+R5g9ERi7FjwbRsuZ04BihZs70+f/29hUzuc6MA87KVovGT0Uc7GVC7bx8NLt0bTBsbydlONVHVQuol/YEpQrQophDvmBfh+PgMDH8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOEITn1vyppR+Moe1UdR0WGPhUnQ/dwHNcNi0OYy21LkBQ5jsxOPLvZ+C2MbRYlz2afs4nYYIV8E0AuK6aRks3w=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKEdFOHVk9tX1R+zEyLVdxS/U5QeeeFYWSnUmjpXlpt7", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", 
"ansible_system_capabilities": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_local": {}, "ansible_is_chroot": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_iscsi_iqn": "", "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-158", "ansible_nodename": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "21e18164a0c64d0daed004bd8a1b67b7", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "32", "second": "54", "epoch": "1726882374", "epoch_int": "1726882374", "date": "2024-09-20", "time": "21:32:54", "iso8601_micro": "2024-09-21T01:32:54.823246Z", "iso8601": "2024-09-21T01:32:54Z", "iso8601_basic": "20240920T213254823246", "iso8601_basic_short": "20240920T213254", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_hostnqn": 
"nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_pkg_mgr": "dnf", "ansible_fips": false, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 33528 10.31.11.158 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 33528 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_loadavg": {"1m": 0.37, "5m": 0.32, "15m": 0.15}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", 
"policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_lsb": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2804, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 728, "free": 2804}, "nocache": {"free": 3260, "used": 272}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2e6858-9a88-b36a-7765-70992ab591a7", "ansible_product_uuid": "ec2e6858-9a88-b36a-7765-70992ab591a7", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", 
"uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 313, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264239464448, "block_size": 4096, "block_total": 65519355, "block_available": 64511588, "block_used": 1007767, "inode_total": 131071472, "inode_available": 130998721, "inode_used": 72751, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", 
"vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:4f:68:7a:de:b1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.158", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::104f:68ff:fe7a:deb1", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": 
"off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", 
"hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.158", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:4f:68:7a:de:b1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.158"], "ansible_all_ipv6_addresses": ["fe80::104f:68ff:fe7a:deb1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.158", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::104f:68ff:fe7a:deb1"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 11661 1726882375.10567: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882374.4159925-11856-6220956690331/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11661 1726882375.10570: _low_level_execute_command(): starting 11661 1726882375.10572: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882374.4159925-11856-6220956690331/ > /dev/null 2>&1 && sleep 0' 11661 1726882375.11107: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882375.11118: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882375.11130: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882375.11146: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882375.11186: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882375.11196: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882375.11208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882375.11222: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass <<< 11661 1726882375.11231: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882375.11239: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882375.11249: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882375.11260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882375.11277: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882375.11289: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882375.11298: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882375.11308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882375.11383: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882375.11405: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882375.11418: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882375.11541: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882375.13429: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882375.13433: stdout chunk (state=3): >>><<< 11661 1726882375.13435: stderr chunk (state=3): >>><<< 11661 1726882375.14068: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882375.14072: handler run complete 11661 1726882375.14074: variable 'ansible_facts' from source: unknown 11661 1726882375.14076: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882375.14079: variable 'ansible_facts' from source: unknown 11661 1726882375.14082: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882375.14209: attempt loop complete, returning result 11661 1726882375.14218: _execute() done 11661 1726882375.14227: dumping result to json 11661 1726882375.14261: done dumping result, returning 11661 1726882375.14276: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0e448fcc-3ce9-896b-2321-000000000128] 11661 1726882375.14287: sending task result for task 0e448fcc-3ce9-896b-2321-000000000128 ok: [managed_node2] 11661 1726882375.14879: no more pending results, returning what we have 11661 1726882375.14883: results queue empty 11661 1726882375.14884: checking for any_errors_fatal 11661 1726882375.14886: done checking for any_errors_fatal 11661 1726882375.14886: checking for max_fail_percentage 11661 1726882375.14888: done checking 
for max_fail_percentage 11661 1726882375.14889: checking to see if all hosts have failed and the running result is not ok 11661 1726882375.14890: done checking to see if all hosts have failed 11661 1726882375.14890: getting the remaining hosts for this loop 11661 1726882375.14892: done getting the remaining hosts for this loop 11661 1726882375.14895: getting the next task for host managed_node2 11661 1726882375.14902: done getting next task for host managed_node2 11661 1726882375.14904: ^ task is: TASK: meta (flush_handlers) 11661 1726882375.14905: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11661 1726882375.14910: getting variables 11661 1726882375.14912: in VariableManager get_vars() 11661 1726882375.14949: Calling all_inventory to load vars for managed_node2 11661 1726882375.14952: Calling groups_inventory to load vars for managed_node2 11661 1726882375.14955: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882375.14967: Calling all_plugins_play to load vars for managed_node2 11661 1726882375.14969: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882375.14973: Calling groups_plugins_play to load vars for managed_node2 11661 1726882375.15127: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882375.15329: done with get_vars() 11661 1726882375.15340: done getting variables 11661 1726882375.15574: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000128 11661 1726882375.15577: WORKER PROCESS EXITING 11661 1726882375.15619: in VariableManager get_vars() 11661 1726882375.15632: Calling all_inventory to load vars for managed_node2 11661 1726882375.15634: Calling groups_inventory to load 
vars for managed_node2 11661 1726882375.15636: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882375.15641: Calling all_plugins_play to load vars for managed_node2 11661 1726882375.15643: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882375.15655: Calling groups_plugins_play to load vars for managed_node2 11661 1726882375.15815: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882375.15943: done with get_vars() 11661 1726882375.15952: done queuing things up, now waiting for results queue to drain 11661 1726882375.15954: results queue empty 11661 1726882375.15954: checking for any_errors_fatal 11661 1726882375.15956: done checking for any_errors_fatal 11661 1726882375.15956: checking for max_fail_percentage 11661 1726882375.15957: done checking for max_fail_percentage 11661 1726882375.15957: checking to see if all hosts have failed and the running result is not ok 11661 1726882375.15958: done checking to see if all hosts have failed 11661 1726882375.15958: getting the remaining hosts for this loop 11661 1726882375.15959: done getting the remaining hosts for this loop 11661 1726882375.15960: getting the next task for host managed_node2 11661 1726882375.15967: done getting next task for host managed_node2 11661 1726882375.15969: ^ task is: TASK: INIT Prepare setup 11661 1726882375.15970: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11661 1726882375.15971: getting variables 11661 1726882375.15971: in VariableManager get_vars() 11661 1726882375.15981: Calling all_inventory to load vars for managed_node2 11661 1726882375.15982: Calling groups_inventory to load vars for managed_node2 11661 1726882375.15983: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882375.15986: Calling all_plugins_play to load vars for managed_node2 11661 1726882375.15987: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882375.15989: Calling groups_plugins_play to load vars for managed_node2 11661 1726882375.16071: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882375.16177: done with get_vars() 11661 1726882375.16183: done getting variables 11661 1726882375.16234: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [INIT Prepare setup] ****************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:15 Friday 20 September 2024 21:32:55 -0400 (0:00:00.801) 0:00:03.875 ****** 11661 1726882375.16253: entering _queue_task() for managed_node2/debug 11661 1726882375.16255: Creating lock for debug 11661 1726882375.16443: worker is 1 (out of 1 available) 11661 1726882375.16459: exiting _queue_task() for managed_node2/debug 11661 1726882375.16473: done queuing things up, now waiting for results queue to drain 11661 1726882375.16475: waiting for pending results... 
11661 1726882375.16621: running TaskExecutor() for managed_node2/TASK: INIT Prepare setup 11661 1726882375.16679: in run() - task 0e448fcc-3ce9-896b-2321-00000000000b 11661 1726882375.16689: variable 'ansible_search_path' from source: unknown 11661 1726882375.16717: calling self._execute() 11661 1726882375.16779: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882375.16793: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882375.16801: variable 'omit' from source: magic vars 11661 1726882375.17084: variable 'ansible_distribution_major_version' from source: facts 11661 1726882375.17094: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882375.17099: variable 'omit' from source: magic vars 11661 1726882375.17114: variable 'omit' from source: magic vars 11661 1726882375.17137: variable 'omit' from source: magic vars 11661 1726882375.17172: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11661 1726882375.17198: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11661 1726882375.17214: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11661 1726882375.17226: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882375.17236: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882375.17262: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11661 1726882375.17267: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882375.17269: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882375.17334: Set connection var ansible_connection to ssh 11661 1726882375.17337: Set 
connection var ansible_pipelining to False 11661 1726882375.17343: Set connection var ansible_module_compression to ZIP_DEFLATED 11661 1726882375.17353: Set connection var ansible_timeout to 10 11661 1726882375.17355: Set connection var ansible_shell_type to sh 11661 1726882375.17363: Set connection var ansible_shell_executable to /bin/sh 11661 1726882375.17381: variable 'ansible_shell_executable' from source: unknown 11661 1726882375.17384: variable 'ansible_connection' from source: unknown 11661 1726882375.17390: variable 'ansible_module_compression' from source: unknown 11661 1726882375.17393: variable 'ansible_shell_type' from source: unknown 11661 1726882375.17396: variable 'ansible_shell_executable' from source: unknown 11661 1726882375.17398: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882375.17402: variable 'ansible_pipelining' from source: unknown 11661 1726882375.17404: variable 'ansible_timeout' from source: unknown 11661 1726882375.17408: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882375.17525: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11661 1726882375.17535: variable 'omit' from source: magic vars 11661 1726882375.17592: starting attempt loop 11661 1726882375.17597: running the handler 11661 1726882375.17630: handler run complete 11661 1726882375.17658: attempt loop complete, returning result 11661 1726882375.17669: _execute() done 11661 1726882375.17676: dumping result to json 11661 1726882375.17683: done dumping result, returning 11661 1726882375.17694: done running TaskExecutor() for managed_node2/TASK: INIT Prepare setup [0e448fcc-3ce9-896b-2321-00000000000b] 11661 1726882375.17703: sending task result for task 
0e448fcc-3ce9-896b-2321-00000000000b 11661 1726882375.17808: done sending task result for task 0e448fcc-3ce9-896b-2321-00000000000b 11661 1726882375.17817: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: ################################################## 11661 1726882375.17874: no more pending results, returning what we have 11661 1726882375.17878: results queue empty 11661 1726882375.17879: checking for any_errors_fatal 11661 1726882375.17881: done checking for any_errors_fatal 11661 1726882375.17881: checking for max_fail_percentage 11661 1726882375.17883: done checking for max_fail_percentage 11661 1726882375.17884: checking to see if all hosts have failed and the running result is not ok 11661 1726882375.17885: done checking to see if all hosts have failed 11661 1726882375.17886: getting the remaining hosts for this loop 11661 1726882375.17887: done getting the remaining hosts for this loop 11661 1726882375.17891: getting the next task for host managed_node2 11661 1726882375.17899: done getting next task for host managed_node2 11661 1726882375.17903: ^ task is: TASK: Install dnsmasq 11661 1726882375.17906: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11661 1726882375.17909: getting variables 11661 1726882375.17911: in VariableManager get_vars() 11661 1726882375.17960: Calling all_inventory to load vars for managed_node2 11661 1726882375.17965: Calling groups_inventory to load vars for managed_node2 11661 1726882375.17968: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882375.17980: Calling all_plugins_play to load vars for managed_node2 11661 1726882375.17983: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882375.17990: Calling groups_plugins_play to load vars for managed_node2 11661 1726882375.18277: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882375.18609: done with get_vars() 11661 1726882375.18617: done getting variables 11661 1726882375.18669: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install dnsmasq] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3 Friday 20 September 2024 21:32:55 -0400 (0:00:00.024) 0:00:03.900 ****** 11661 1726882375.18700: entering _queue_task() for managed_node2/package 11661 1726882375.18912: worker is 1 (out of 1 available) 11661 1726882375.18925: exiting _queue_task() for managed_node2/package 11661 1726882375.18936: done queuing things up, now waiting for results queue to drain 11661 1726882375.18938: waiting for pending results... 
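The `entering _queue_task()` / `worker is 1 (out of 1 available)` / `waiting for results queue to drain` sequence above is a classic task-queue dispatch pattern. A hedged, self-contained sketch of that pattern using the standard library — names and the threading model are illustrative assumptions, not Ansible's real worker implementation (which forks processes):

```python
# Sketch of queue-and-drain dispatch: a controller enqueues tasks, a
# worker consumes them and pushes results, and the controller blocks
# until the task queue is fully drained before reading results.
import queue
import threading

def worker(tasks: queue.Queue, results: queue.Queue) -> None:
    while True:
        task = tasks.get()
        if task is None:            # sentinel: no more work
            tasks.task_done()
            break
        results.put((task, "ok"))   # stand-in for actually running the task
        tasks.task_done()

tasks, results = queue.Queue(), queue.Queue()
t = threading.Thread(target=worker, args=(tasks, results))
t.start()
tasks.put("INIT Prepare setup")
tasks.put("Install dnsmasq")
tasks.put(None)
tasks.join()   # "waiting for results queue to drain"
t.join()
```

The sentinel-plus-`join()` shape is what makes the "done queuing things up, now waiting for results" log lines safe: the controller never reads results until every queued task has been acknowledged.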
11661 1726882375.19179: running TaskExecutor() for managed_node2/TASK: Install dnsmasq 11661 1726882375.19279: in run() - task 0e448fcc-3ce9-896b-2321-00000000000f 11661 1726882375.19297: variable 'ansible_search_path' from source: unknown 11661 1726882375.19305: variable 'ansible_search_path' from source: unknown 11661 1726882375.19344: calling self._execute() 11661 1726882375.19427: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882375.19436: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882375.19446: variable 'omit' from source: magic vars 11661 1726882375.19833: variable 'ansible_distribution_major_version' from source: facts 11661 1726882375.19848: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882375.19860: variable 'omit' from source: magic vars 11661 1726882375.19931: variable 'omit' from source: magic vars 11661 1726882375.20127: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11661 1726882375.21740: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11661 1726882375.21788: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11661 1726882375.21813: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11661 1726882375.21839: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11661 1726882375.21865: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11661 1726882375.21940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11661 1726882375.22098: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11661 1726882375.22102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882375.22104: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11661 1726882375.22106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11661 1726882375.22146: variable '__network_is_ostree' from source: set_fact 11661 1726882375.22160: variable 'omit' from source: magic vars 11661 1726882375.22194: variable 'omit' from source: magic vars 11661 1726882375.22224: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11661 1726882375.22259: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11661 1726882375.22284: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11661 1726882375.22305: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882375.22320: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882375.22355: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11661 1726882375.22366: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882375.22375: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 11661 1726882375.22477: Set connection var ansible_connection to ssh 11661 1726882375.22488: Set connection var ansible_pipelining to False 11661 1726882375.22499: Set connection var ansible_module_compression to ZIP_DEFLATED 11661 1726882375.22511: Set connection var ansible_timeout to 10 11661 1726882375.22518: Set connection var ansible_shell_type to sh 11661 1726882375.22529: Set connection var ansible_shell_executable to /bin/sh 11661 1726882375.22559: variable 'ansible_shell_executable' from source: unknown 11661 1726882375.22570: variable 'ansible_connection' from source: unknown 11661 1726882375.22578: variable 'ansible_module_compression' from source: unknown 11661 1726882375.22584: variable 'ansible_shell_type' from source: unknown 11661 1726882375.22590: variable 'ansible_shell_executable' from source: unknown 11661 1726882375.22595: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882375.22605: variable 'ansible_pipelining' from source: unknown 11661 1726882375.22611: variable 'ansible_timeout' from source: unknown 11661 1726882375.22618: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882375.22715: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11661 1726882375.22731: variable 'omit' from source: magic vars 11661 1726882375.22740: starting attempt loop 11661 1726882375.22746: running the handler 11661 1726882375.22760: variable 'ansible_facts' from source: unknown 11661 1726882375.22769: variable 'ansible_facts' from source: unknown 11661 1726882375.22806: _low_level_execute_command(): starting 11661 1726882375.22818: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11661 
1726882375.23546: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882375.23568: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882375.23585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882375.23605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882375.23654: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882375.23669: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882375.23684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882375.23703: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882375.23715: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882375.23727: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882375.23739: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882375.23756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882375.23775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882375.23788: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882375.23800: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882375.23814: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882375.23898: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882375.23931: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882375.23948: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882375.24092: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882375.25750: stdout chunk (state=3): >>>/root <<< 11661 1726882375.25880: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882375.25958: stderr chunk (state=3): >>><<< 11661 1726882375.25961: stdout chunk (state=3): >>><<< 11661 1726882375.26078: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882375.26082: _low_level_execute_command(): starting 11661 1726882375.26085: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882375.2598834-11917-120047057610625 `" && echo 
ansible-tmp-1726882375.2598834-11917-120047057610625="` echo /root/.ansible/tmp/ansible-tmp-1726882375.2598834-11917-120047057610625 `" ) && sleep 0' 11661 1726882375.26688: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882375.26702: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882375.26717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882375.26734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882375.26784: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882375.26796: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882375.26810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882375.26827: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882375.26838: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882375.26854: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882375.26869: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882375.26883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882375.26899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882375.26912: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882375.26922: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882375.26935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882375.27017: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master <<< 11661 1726882375.27038: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882375.27053: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882375.27196: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882375.29068: stdout chunk (state=3): >>>ansible-tmp-1726882375.2598834-11917-120047057610625=/root/.ansible/tmp/ansible-tmp-1726882375.2598834-11917-120047057610625 <<< 11661 1726882375.29180: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882375.29252: stderr chunk (state=3): >>><<< 11661 1726882375.29267: stdout chunk (state=3): >>><<< 11661 1726882375.29388: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882375.2598834-11917-120047057610625=/root/.ansible/tmp/ansible-tmp-1726882375.2598834-11917-120047057610625 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882375.29392: variable 'ansible_module_compression' from source: unknown 11661 1726882375.29509: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 11661 1726882375.29513: ANSIBALLZ: Acquiring lock 11661 1726882375.29516: ANSIBALLZ: Lock acquired: 139652576276224 11661 1726882375.29518: ANSIBALLZ: Creating module 11661 1726882375.53411: ANSIBALLZ: Writing module into payload 11661 1726882375.53679: ANSIBALLZ: Writing module 11661 1726882375.53703: ANSIBALLZ: Renaming module 11661 1726882375.53708: ANSIBALLZ: Done creating module 11661 1726882375.53725: variable 'ansible_facts' from source: unknown 11661 1726882375.53811: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882375.2598834-11917-120047057610625/AnsiballZ_dnf.py 11661 1726882375.53957: Sending initial data 11661 1726882375.53960: Sent initial data (152 bytes) 11661 1726882375.54945: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882375.54955: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882375.54967: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882375.54985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882375.55022: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882375.55029: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882375.55039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882375.55054: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882375.55060: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882375.55071: stderr chunk 
(state=3): >>>debug1: re-parsing configuration <<< 11661 1726882375.55080: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882375.55094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882375.55104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882375.55112: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882375.55118: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882375.55128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882375.55217: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882375.55221: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882375.55231: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882375.55419: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882375.57248: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11661 1726882375.57347: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 11661 1726882375.57450: stdout chunk (state=3): 
>>>sftp> put /root/.ansible/tmp/ansible-local-11661k_nrvl51/tmp28zg3j2z /root/.ansible/tmp/ansible-tmp-1726882375.2598834-11917-120047057610625/AnsiballZ_dnf.py <<< 11661 1726882375.57551: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11661 1726882375.59378: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882375.59533: stderr chunk (state=3): >>><<< 11661 1726882375.59536: stdout chunk (state=3): >>><<< 11661 1726882375.59556: done transferring module to remote 11661 1726882375.59575: _low_level_execute_command(): starting 11661 1726882375.59579: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882375.2598834-11917-120047057610625/ /root/.ansible/tmp/ansible-tmp-1726882375.2598834-11917-120047057610625/AnsiballZ_dnf.py && sleep 0' 11661 1726882375.60937: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882375.60940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882375.60989: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882375.60995: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882375.61008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882375.61021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 
10.31.11.158 debug2: match found <<< 11661 1726882375.61026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882375.61102: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882375.61112: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882375.61127: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882375.61253: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882375.63489: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882375.63970: stderr chunk (state=3): >>><<< 11661 1726882375.63973: stdout chunk (state=3): >>><<< 11661 1726882375.63976: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 
0 11661 1726882375.63979: _low_level_execute_command(): starting 11661 1726882375.63981: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882375.2598834-11917-120047057610625/AnsiballZ_dnf.py && sleep 0' 11661 1726882375.64947: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882375.64957: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882375.64968: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882375.64982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882375.65020: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882375.65027: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882375.65037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882375.65054: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882375.65057: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882375.65065: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882375.65076: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882375.65085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882375.65095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882375.65102: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882375.65108: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882375.65117: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882375.65188: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882375.65279: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882375.65282: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882375.65419: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882377.94818: stdout chunk (state=3): >>> {"msg": "", "changed": true, "results": ["Installed: dnsmasq-2.85-16.el9.x86_64"], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 11661 1726882378.01859: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 11661 1726882378.01941: stderr chunk (state=3): >>><<< 11661 1726882378.01945: stdout chunk (state=3): >>><<< 11661 1726882378.02098: _low_level_execute_command() done: rc=0, stdout= {"msg": "", "changed": true, "results": ["Installed: dnsmasq-2.85-16.el9.x86_64"], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 11661 1726882378.02102: done with _execute_module (ansible.legacy.dnf, {'name': 'dnsmasq', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882375.2598834-11917-120047057610625/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11661 1726882378.02105: _low_level_execute_command(): starting 11661 1726882378.02108: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882375.2598834-11917-120047057610625/ > /dev/null 2>&1 && sleep 0' 11661 1726882378.02681: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882378.02697: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882378.02711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882378.02728: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882378.02775: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882378.02787: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882378.02800: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882378.02817: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882378.02828: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882378.02839: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882378.02855: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882378.02875: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882378.02890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882378.02901: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882378.02913: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882378.02925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882378.03005: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882378.03021: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882378.03035: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882378.03169: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882378.05042: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882378.05125: stderr chunk (state=3): >>><<< 11661 1726882378.05129: stdout chunk (state=3): >>><<< 11661 1726882378.05273: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882378.05280: handler run complete 11661 1726882378.05381: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11661 1726882378.05539: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11661 1726882378.05588: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11661 1726882378.05630: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11661 1726882378.05670: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11661 1726882378.05754: variable '__install_status' from source: unknown 11661 1726882378.05781: Evaluated conditional (__install_status is success): True 11661 1726882378.05802: attempt loop complete, returning result 11661 1726882378.05811: _execute() done 11661 1726882378.05821: dumping result to json 11661 1726882378.05831: done dumping result, returning 11661 1726882378.05844: done running TaskExecutor() for managed_node2/TASK: 
Install dnsmasq [0e448fcc-3ce9-896b-2321-00000000000f] 11661 1726882378.05856: sending task result for task 0e448fcc-3ce9-896b-2321-00000000000f changed: [managed_node2] => { "attempts": 1, "changed": true, "rc": 0, "results": [ "Installed: dnsmasq-2.85-16.el9.x86_64" ] } 11661 1726882378.06054: no more pending results, returning what we have 11661 1726882378.06057: results queue empty 11661 1726882378.06057: checking for any_errors_fatal 11661 1726882378.06066: done checking for any_errors_fatal 11661 1726882378.06067: checking for max_fail_percentage 11661 1726882378.06068: done checking for max_fail_percentage 11661 1726882378.06069: checking to see if all hosts have failed and the running result is not ok 11661 1726882378.06070: done checking to see if all hosts have failed 11661 1726882378.06070: getting the remaining hosts for this loop 11661 1726882378.06072: done getting the remaining hosts for this loop 11661 1726882378.06075: getting the next task for host managed_node2 11661 1726882378.06082: done getting next task for host managed_node2 11661 1726882378.06084: ^ task is: TASK: Install pgrep, sysctl 11661 1726882378.06087: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11661 1726882378.06089: getting variables 11661 1726882378.06091: in VariableManager get_vars() 11661 1726882378.06126: Calling all_inventory to load vars for managed_node2 11661 1726882378.06129: Calling groups_inventory to load vars for managed_node2 11661 1726882378.06131: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882378.06142: Calling all_plugins_play to load vars for managed_node2 11661 1726882378.06144: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882378.06148: Calling groups_plugins_play to load vars for managed_node2 11661 1726882378.06306: done sending task result for task 0e448fcc-3ce9-896b-2321-00000000000f 11661 1726882378.06310: WORKER PROCESS EXITING 11661 1726882378.06323: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882378.06541: done with get_vars() 11661 1726882378.06552: done getting variables 11661 1726882378.06612: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:17 Friday 20 September 2024 21:32:58 -0400 (0:00:02.879) 0:00:06.779 ****** 11661 1726882378.06648: entering _queue_task() for managed_node2/package 11661 1726882378.06904: worker is 1 (out of 1 available) 11661 1726882378.06917: exiting _queue_task() for managed_node2/package 11661 1726882378.06930: done queuing things up, now waiting for results queue to drain 11661 1726882378.06931: waiting for pending results... 
11661 1726882378.07188: running TaskExecutor() for managed_node2/TASK: Install pgrep, sysctl 11661 1726882378.07297: in run() - task 0e448fcc-3ce9-896b-2321-000000000010 11661 1726882378.07316: variable 'ansible_search_path' from source: unknown 11661 1726882378.07324: variable 'ansible_search_path' from source: unknown 11661 1726882378.07362: calling self._execute() 11661 1726882378.07453: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882378.07469: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882378.07494: variable 'omit' from source: magic vars 11661 1726882378.07852: variable 'ansible_distribution_major_version' from source: facts 11661 1726882378.07870: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882378.07989: variable 'ansible_os_family' from source: facts 11661 1726882378.07999: Evaluated conditional (ansible_os_family == 'RedHat'): True 11661 1726882378.08180: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11661 1726882378.08507: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11661 1726882378.08554: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11661 1726882378.08600: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11661 1726882378.08636: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11661 1726882378.08719: variable 'ansible_distribution_major_version' from source: facts 11661 1726882378.08736: Evaluated conditional (ansible_distribution_major_version is version('6', '<=')): False 11661 1726882378.08744: when evaluation is False, skipping this task 11661 1726882378.08750: _execute() done 11661 1726882378.08757: dumping result to json 11661 1726882378.08767: done dumping result, 
returning 11661 1726882378.08780: done running TaskExecutor() for managed_node2/TASK: Install pgrep, sysctl [0e448fcc-3ce9-896b-2321-000000000010] 11661 1726882378.08796: sending task result for task 0e448fcc-3ce9-896b-2321-000000000010 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version is version('6', '<=')", "skip_reason": "Conditional result was False" } 11661 1726882378.08956: no more pending results, returning what we have 11661 1726882378.08960: results queue empty 11661 1726882378.08961: checking for any_errors_fatal 11661 1726882378.08971: done checking for any_errors_fatal 11661 1726882378.08973: checking for max_fail_percentage 11661 1726882378.08974: done checking for max_fail_percentage 11661 1726882378.08976: checking to see if all hosts have failed and the running result is not ok 11661 1726882378.08976: done checking to see if all hosts have failed 11661 1726882378.08977: getting the remaining hosts for this loop 11661 1726882378.08979: done getting the remaining hosts for this loop 11661 1726882378.08982: getting the next task for host managed_node2 11661 1726882378.08989: done getting next task for host managed_node2 11661 1726882378.08992: ^ task is: TASK: Install pgrep, sysctl 11661 1726882378.08995: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11661 1726882378.08999: getting variables 11661 1726882378.09001: in VariableManager get_vars() 11661 1726882378.09040: Calling all_inventory to load vars for managed_node2 11661 1726882378.09043: Calling groups_inventory to load vars for managed_node2 11661 1726882378.09045: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882378.09057: Calling all_plugins_play to load vars for managed_node2 11661 1726882378.09060: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882378.09066: Calling groups_plugins_play to load vars for managed_node2 11661 1726882378.09309: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882378.09534: done with get_vars() 11661 1726882378.09552: done getting variables 11661 1726882378.09656: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000010 11661 1726882378.09659: WORKER PROCESS EXITING 11661 1726882378.09696: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26 Friday 20 September 2024 21:32:58 -0400 (0:00:00.031) 0:00:06.811 ****** 11661 1726882378.09761: entering _queue_task() for managed_node2/package 11661 1726882378.10261: worker is 1 (out of 1 available) 11661 1726882378.10274: exiting _queue_task() for managed_node2/package 11661 1726882378.10285: done queuing things up, now waiting for results queue to drain 11661 1726882378.10287: waiting for pending results... 
11661 1726882378.10544: running TaskExecutor() for managed_node2/TASK: Install pgrep, sysctl 11661 1726882378.10662: in run() - task 0e448fcc-3ce9-896b-2321-000000000011 11661 1726882378.10684: variable 'ansible_search_path' from source: unknown 11661 1726882378.10695: variable 'ansible_search_path' from source: unknown 11661 1726882378.10736: calling self._execute() 11661 1726882378.10819: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882378.10830: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882378.10847: variable 'omit' from source: magic vars 11661 1726882378.11260: variable 'ansible_distribution_major_version' from source: facts 11661 1726882378.11280: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882378.11403: variable 'ansible_os_family' from source: facts 11661 1726882378.11413: Evaluated conditional (ansible_os_family == 'RedHat'): True 11661 1726882378.11593: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11661 1726882378.11859: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11661 1726882378.11910: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11661 1726882378.11951: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11661 1726882378.11991: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11661 1726882378.12073: variable 'ansible_distribution_major_version' from source: facts 11661 1726882378.12089: Evaluated conditional (ansible_distribution_major_version is version('7', '>=')): True 11661 1726882378.12098: variable 'omit' from source: magic vars 11661 1726882378.12152: variable 'omit' from source: magic vars 11661 1726882378.12312: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11661 1726882378.14540: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11661 1726882378.14620: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11661 1726882378.14665: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11661 1726882378.14706: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11661 1726882378.14740: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11661 1726882378.14835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11661 1726882378.14873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11661 1726882378.14904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882378.14958: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11661 1726882378.14980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11661 1726882378.15089: variable '__network_is_ostree' from source: set_fact 11661 1726882378.15099: 
variable 'omit' from source: magic vars 11661 1726882378.15135: variable 'omit' from source: magic vars 11661 1726882378.15168: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11661 1726882378.15202: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11661 1726882378.15224: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11661 1726882378.15249: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882378.15265: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882378.15300: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11661 1726882378.15308: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882378.15315: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882378.15423: Set connection var ansible_connection to ssh 11661 1726882378.15433: Set connection var ansible_pipelining to False 11661 1726882378.15442: Set connection var ansible_module_compression to ZIP_DEFLATED 11661 1726882378.15459: Set connection var ansible_timeout to 10 11661 1726882378.15468: Set connection var ansible_shell_type to sh 11661 1726882378.15480: Set connection var ansible_shell_executable to /bin/sh 11661 1726882378.15510: variable 'ansible_shell_executable' from source: unknown 11661 1726882378.15518: variable 'ansible_connection' from source: unknown 11661 1726882378.15524: variable 'ansible_module_compression' from source: unknown 11661 1726882378.15530: variable 'ansible_shell_type' from source: unknown 11661 1726882378.15536: variable 'ansible_shell_executable' from source: unknown 11661 1726882378.15542: variable 'ansible_host' from source: host vars for 'managed_node2' 
11661 1726882378.15549: variable 'ansible_pipelining' from source: unknown 11661 1726882378.15555: variable 'ansible_timeout' from source: unknown 11661 1726882378.15568: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882378.15672: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11661 1726882378.15687: variable 'omit' from source: magic vars 11661 1726882378.15696: starting attempt loop 11661 1726882378.15704: running the handler 11661 1726882378.15717: variable 'ansible_facts' from source: unknown 11661 1726882378.15725: variable 'ansible_facts' from source: unknown 11661 1726882378.15760: _low_level_execute_command(): starting 11661 1726882378.15773: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11661 1726882378.16540: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882378.16557: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882378.16576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882378.16597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882378.16640: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882378.16656: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882378.16672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882378.16692: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882378.16712: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882378.16722: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882378.16733: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882378.16745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882378.16761: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882378.16778: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882378.16788: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882378.16803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882378.16884: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882378.16908: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882378.16926: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882378.17065: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882378.18722: stdout chunk (state=3): >>>/root <<< 11661 1726882378.18829: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882378.18881: stderr chunk (state=3): >>><<< 11661 1726882378.18884: stdout chunk (state=3): >>><<< 11661 1726882378.18906: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882378.18916: _low_level_execute_command(): starting 11661 1726882378.18921: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882378.189056-12011-23704428249174 `" && echo ansible-tmp-1726882378.189056-12011-23704428249174="` echo /root/.ansible/tmp/ansible-tmp-1726882378.189056-12011-23704428249174 `" ) && sleep 0' 11661 1726882378.19368: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882378.19375: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882378.19406: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882378.19412: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882378.19420: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is 
address <<< 11661 1726882378.19425: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882378.19432: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882378.19442: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882378.19451: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882378.19505: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882378.19520: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882378.19532: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882378.19642: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882378.21526: stdout chunk (state=3): >>>ansible-tmp-1726882378.189056-12011-23704428249174=/root/.ansible/tmp/ansible-tmp-1726882378.189056-12011-23704428249174 <<< 11661 1726882378.21634: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882378.21684: stderr chunk (state=3): >>><<< 11661 1726882378.21687: stdout chunk (state=3): >>><<< 11661 1726882378.21719: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882378.189056-12011-23704428249174=/root/.ansible/tmp/ansible-tmp-1726882378.189056-12011-23704428249174 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882378.21725: variable 'ansible_module_compression' from source: unknown 11661 1726882378.21774: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11661k_nrvl51/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 11661 1726882378.21808: variable 'ansible_facts' from source: unknown 11661 1726882378.21884: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882378.189056-12011-23704428249174/AnsiballZ_dnf.py 11661 1726882378.21991: Sending initial data 11661 1726882378.21995: Sent initial data (150 bytes) 11661 1726882378.22654: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882378.22658: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882378.22691: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882378.22701: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882378.22706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
<<< 11661 1726882378.22723: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882378.22730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882378.22733: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882378.22742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882378.22794: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882378.22830: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882378.22832: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882378.22960: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882378.24751: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 11661 1726882378.24766: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 11661 1726882378.24773: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 11661 1726882378.24780: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 11661 1726882378.24787: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 11661 1726882378.24794: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 11661 1726882378.24804: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 <<< 11661 1726882378.24810: stderr chunk (state=3): 
>>>debug2: Server supports extension "limits@openssh.com" revision 1 <<< 11661 1726882378.24817: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11661 1726882378.24928: stderr chunk (state=3): >>>debug1: Using server download size 261120 <<< 11661 1726882378.24934: stderr chunk (state=3): >>>debug1: Using server upload size 261120 <<< 11661 1726882378.24941: stderr chunk (state=3): >>>debug1: Server handle limit 1019; using 64 <<< 11661 1726882378.25053: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11661k_nrvl51/tmpmqser3y0 /root/.ansible/tmp/ansible-tmp-1726882378.189056-12011-23704428249174/AnsiballZ_dnf.py <<< 11661 1726882378.25159: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11661 1726882378.26774: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882378.26924: stderr chunk (state=3): >>><<< 11661 1726882378.26929: stdout chunk (state=3): >>><<< 11661 1726882378.26955: done transferring module to remote 11661 1726882378.26966: _low_level_execute_command(): starting 11661 1726882378.26973: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882378.189056-12011-23704428249174/ /root/.ansible/tmp/ansible-tmp-1726882378.189056-12011-23704428249174/AnsiballZ_dnf.py && sleep 0' 11661 1726882378.27603: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882378.27610: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882378.27621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882378.27635: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882378.27676: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 
1726882378.27683: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882378.27694: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882378.27706: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882378.27714: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882378.27720: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882378.27727: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882378.27736: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882378.27747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882378.27754: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882378.27760: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882378.27771: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882378.27842: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882378.27859: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882378.27873: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882378.27995: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882378.29860: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882378.29866: stdout chunk (state=3): >>><<< 11661 1726882378.29873: stderr chunk (state=3): >>><<< 11661 1726882378.29888: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882378.29891: _low_level_execute_command(): starting 11661 1726882378.29897: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882378.189056-12011-23704428249174/AnsiballZ_dnf.py && sleep 0' 11661 1726882378.30455: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882378.30459: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882378.30468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882378.30482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882378.30519: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882378.30526: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882378.30535: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882378.30552: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882378.30556: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882378.30565: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882378.30576: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882378.30583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882378.30595: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882378.30602: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882378.30608: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882378.30617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882378.30689: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882378.30702: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882378.30712: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882378.30840: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882379.33407: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, 
"install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 11661 1726882379.39066: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 11661 1726882379.39075: stdout chunk (state=3): >>><<< 11661 1726882379.39078: stderr chunk (state=3): >>><<< 11661 1726882379.39096: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 11661 1726882379.39127: done with _execute_module (ansible.legacy.dnf, {'name': 'procps-ng', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882378.189056-12011-23704428249174/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11661 1726882379.39134: _low_level_execute_command(): starting 11661 1726882379.39139: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882378.189056-12011-23704428249174/ > /dev/null 2>&1 && sleep 0' 11661 1726882379.39709: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882379.39716: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882379.39725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 
1726882379.39737: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882379.39776: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882379.39784: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882379.39791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882379.39803: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882379.39810: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882379.39816: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882379.39823: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882379.39831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882379.39841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882379.39847: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882379.39856: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882379.39866: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882379.39932: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882379.39946: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882379.39951: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882379.40078: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882379.41882: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882379.41920: 
stderr chunk (state=3): >>><<< 11661 1726882379.41923: stdout chunk (state=3): >>><<< 11661 1726882379.41938: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882379.41946: handler run complete 11661 1726882379.41972: attempt loop complete, returning result 11661 1726882379.41975: _execute() done 11661 1726882379.41978: dumping result to json 11661 1726882379.41982: done dumping result, returning 11661 1726882379.41990: done running TaskExecutor() for managed_node2/TASK: Install pgrep, sysctl [0e448fcc-3ce9-896b-2321-000000000011] 11661 1726882379.41994: sending task result for task 0e448fcc-3ce9-896b-2321-000000000011 11661 1726882379.42091: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000011 11661 1726882379.42094: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "rc": 0, "results": [] } MSG: 
Nothing to do 11661 1726882379.42163: no more pending results, returning what we have 11661 1726882379.42168: results queue empty 11661 1726882379.42168: checking for any_errors_fatal 11661 1726882379.42174: done checking for any_errors_fatal 11661 1726882379.42175: checking for max_fail_percentage 11661 1726882379.42177: done checking for max_fail_percentage 11661 1726882379.42178: checking to see if all hosts have failed and the running result is not ok 11661 1726882379.42178: done checking to see if all hosts have failed 11661 1726882379.42179: getting the remaining hosts for this loop 11661 1726882379.42181: done getting the remaining hosts for this loop 11661 1726882379.42184: getting the next task for host managed_node2 11661 1726882379.42189: done getting next task for host managed_node2 11661 1726882379.42192: ^ task is: TASK: Create test interfaces 11661 1726882379.42194: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11661 1726882379.42197: getting variables 11661 1726882379.42198: in VariableManager get_vars() 11661 1726882379.42237: Calling all_inventory to load vars for managed_node2 11661 1726882379.42240: Calling groups_inventory to load vars for managed_node2 11661 1726882379.42241: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882379.42253: Calling all_plugins_play to load vars for managed_node2 11661 1726882379.42255: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882379.42258: Calling groups_plugins_play to load vars for managed_node2 11661 1726882379.42399: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882379.42517: done with get_vars() 11661 1726882379.42525: done getting variables 11661 1726882379.42595: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Create test interfaces] ************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35 Friday 20 September 2024 21:32:59 -0400 (0:00:01.328) 0:00:08.139 ****** 11661 1726882379.42616: entering _queue_task() for managed_node2/shell 11661 1726882379.42618: Creating lock for shell 11661 1726882379.42801: worker is 1 (out of 1 available) 11661 1726882379.42812: exiting _queue_task() for managed_node2/shell 11661 1726882379.42823: done queuing things up, now waiting for results queue to drain 11661 1726882379.42825: waiting for pending results... 
11661 1726882379.42974: running TaskExecutor() for managed_node2/TASK: Create test interfaces 11661 1726882379.43037: in run() - task 0e448fcc-3ce9-896b-2321-000000000012 11661 1726882379.43048: variable 'ansible_search_path' from source: unknown 11661 1726882379.43053: variable 'ansible_search_path' from source: unknown 11661 1726882379.43080: calling self._execute() 11661 1726882379.43136: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882379.43140: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882379.43151: variable 'omit' from source: magic vars 11661 1726882379.43410: variable 'ansible_distribution_major_version' from source: facts 11661 1726882379.43420: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882379.43425: variable 'omit' from source: magic vars 11661 1726882379.43458: variable 'omit' from source: magic vars 11661 1726882379.43694: variable 'dhcp_interface1' from source: play vars 11661 1726882379.43698: variable 'dhcp_interface2' from source: play vars 11661 1726882379.43724: variable 'omit' from source: magic vars 11661 1726882379.43758: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11661 1726882379.43832: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11661 1726882379.43847: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11661 1726882379.43863: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882379.43874: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882379.43896: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11661 1726882379.43899: variable 'ansible_host' from source: host 
vars for 'managed_node2' 11661 1726882379.43901: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882379.43970: Set connection var ansible_connection to ssh 11661 1726882379.43976: Set connection var ansible_pipelining to False 11661 1726882379.43981: Set connection var ansible_module_compression to ZIP_DEFLATED 11661 1726882379.43987: Set connection var ansible_timeout to 10 11661 1726882379.43990: Set connection var ansible_shell_type to sh 11661 1726882379.43996: Set connection var ansible_shell_executable to /bin/sh 11661 1726882379.44011: variable 'ansible_shell_executable' from source: unknown 11661 1726882379.44014: variable 'ansible_connection' from source: unknown 11661 1726882379.44016: variable 'ansible_module_compression' from source: unknown 11661 1726882379.44018: variable 'ansible_shell_type' from source: unknown 11661 1726882379.44021: variable 'ansible_shell_executable' from source: unknown 11661 1726882379.44023: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882379.44027: variable 'ansible_pipelining' from source: unknown 11661 1726882379.44029: variable 'ansible_timeout' from source: unknown 11661 1726882379.44035: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882379.44128: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11661 1726882379.44135: variable 'omit' from source: magic vars 11661 1726882379.44140: starting attempt loop 11661 1726882379.44144: running the handler 11661 1726882379.44154: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11661 1726882379.44166: _low_level_execute_command(): starting 11661 1726882379.44174: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11661 1726882379.44691: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882379.44707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882379.44733: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 11661 1726882379.44755: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882379.44759: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 11661 1726882379.44771: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882379.44781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882379.44827: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882379.44839: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882379.44949: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 
1726882379.46557: stdout chunk (state=3): >>>/root <<< 11661 1726882379.46722: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882379.46725: stdout chunk (state=3): >>><<< 11661 1726882379.46734: stderr chunk (state=3): >>><<< 11661 1726882379.46760: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882379.46771: _low_level_execute_command(): starting 11661 1726882379.46777: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882379.467586-12059-96980598354643 `" && echo ansible-tmp-1726882379.467586-12059-96980598354643="` echo /root/.ansible/tmp/ansible-tmp-1726882379.467586-12059-96980598354643 `" ) && sleep 0' 11661 1726882379.47347: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882379.47351: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882379.47389: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882379.47392: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 11661 1726882379.47395: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882379.47397: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882379.47442: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882379.47446: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882379.47555: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882379.49450: stdout chunk (state=3): >>>ansible-tmp-1726882379.467586-12059-96980598354643=/root/.ansible/tmp/ansible-tmp-1726882379.467586-12059-96980598354643 <<< 11661 1726882379.49565: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882379.49608: stderr chunk (state=3): >>><<< 11661 1726882379.49613: stdout chunk (state=3): >>><<< 11661 1726882379.49627: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882379.467586-12059-96980598354643=/root/.ansible/tmp/ansible-tmp-1726882379.467586-12059-96980598354643 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882379.49655: variable 'ansible_module_compression' from source: unknown 11661 1726882379.49694: ANSIBALLZ: Using generic lock for ansible.legacy.command 11661 1726882379.49697: ANSIBALLZ: Acquiring lock 11661 1726882379.49699: ANSIBALLZ: Lock acquired: 139652576276224 11661 1726882379.49702: ANSIBALLZ: Creating module 11661 1726882379.61787: ANSIBALLZ: Writing module into payload 11661 1726882379.61904: ANSIBALLZ: Writing module 11661 1726882379.61931: ANSIBALLZ: Renaming module 11661 1726882379.61950: ANSIBALLZ: Done creating module 11661 1726882379.61975: variable 'ansible_facts' from source: unknown 11661 1726882379.62047: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1726882379.467586-12059-96980598354643/AnsiballZ_command.py 11661 1726882379.62200: Sending initial data 11661 1726882379.62203: Sent initial data (154 bytes) 11661 1726882379.63145: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882379.63157: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882379.63173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882379.63188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882379.63225: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882379.63237: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882379.63251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882379.63272: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882379.63286: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882379.63298: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882379.63311: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882379.63325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882379.63343: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882379.63355: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882379.63368: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882379.63381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 
1726882379.63457: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882379.63483: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882379.63499: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882379.63634: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882379.65474: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11661 1726882379.65565: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 11661 1726882379.65666: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11661k_nrvl51/tmples1pzyp /root/.ansible/tmp/ansible-tmp-1726882379.467586-12059-96980598354643/AnsiballZ_command.py <<< 11661 1726882379.65759: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11661 1726882379.67392: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882379.67513: stderr chunk (state=3): >>><<< 11661 1726882379.67516: stdout chunk (state=3): >>><<< 11661 1726882379.67519: done transferring module to remote 11661 1726882379.67521: _low_level_execute_command(): starting 11661 1726882379.67523: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1726882379.467586-12059-96980598354643/ /root/.ansible/tmp/ansible-tmp-1726882379.467586-12059-96980598354643/AnsiballZ_command.py && sleep 0' 11661 1726882379.69035: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882379.69039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882379.69077: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 11661 1726882379.69081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882379.69084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882379.69118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 11661 1726882379.69120: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882379.69191: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882379.69286: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882379.69436: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882379.71224: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882379.71302: stderr chunk (state=3): >>><<< 11661 1726882379.71305: 
stdout chunk (state=3): >>><<< 11661 1726882379.71397: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882379.71400: _low_level_execute_command(): starting 11661 1726882379.71403: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882379.467586-12059-96980598354643/AnsiballZ_command.py && sleep 0' 11661 1726882379.72880: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882379.72884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882379.72918: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found 
<<< 11661 1726882379.72922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882379.72924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 11661 1726882379.73038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882379.73180: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882379.73260: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882379.73265: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882379.73397: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882381.09222: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 6692 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 6692 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ grep -q 
'inet [1-9]'\n+ ip addr show testbr\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if <<< 11661 1726882381.09231: stdout chunk (state=3): >>>! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-20 21:32:59.864768", "end": "2024-09-20 21:33:01.089115", "delta": "0:00:01.224347", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! 
rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11661 1726882381.10537: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 11661 1726882381.10596: stderr chunk (state=3): >>><<< 11661 1726882381.10599: stdout chunk (state=3): >>><<< 11661 1726882381.10622: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 6692 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 6692 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ grep -q 'inet [1-9]'\n+ ip addr show testbr\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # 
NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active 
firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-20 21:32:59.864768", "end": "2024-09-20 21:33:01.089115", "delta": "0:00:01.224347", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! 
ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
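An aside on the result just dumped above: the module output shows `"stdout": ""` while the entire `+`-prefixed command trace lands in `"stderr"`. That is a direct consequence of the script's first two lines, `set -euxo pipefail` (where `-x` traces each command to stderr with a `+` prefix) and `exec 1>&2` (which redirects stdout into stderr). A minimal standalone demonstration with harmless commands (the `out`/`err` capture here is illustrative, not part of the log):

```shell
#!/bin/sh
# Reproduce the stdout/stderr split seen in the module result.
# Inner script mirrors the log's preamble: set -euxo pipefail; exec 1>&2

# Capture only stdout: empty, because exec 1>&2 sent everything to stderr
# (fd 2 is /dev/null here, so the redirected fd 1 goes there too).
out=$(bash -c 'set -euxo pipefail; exec 1>&2; echo hello' 2>/dev/null)

# Capture only stderr: contains both the xtrace lines ("+ exec",
# "+ echo hello") and the echoed text itself.
err=$(bash -c 'set -euxo pipefail; exec 1>&2; echo hello' 2>&1 >/dev/null)

echo "captured stdout: [$out]"
echo "stderr begins with trace:"
echo "$err"
```

Note that the first traced line is `+ exec` with no redirection shown — xtrace prints expanded command words only — which matches the first stderr line of the module result above.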
11661 1726882381.10663: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n "$(pgrep NetworkManager)" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the \'testbr\' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n "$(pgrep NetworkManager)" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q \'inet [1-9]\'\ndo\n let "timer+=1"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\ndone\n\nif grep \'release 6\' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo \'interface testbr {\' > /etc/radvd.conf\n echo \' AdvSendAdvert on;\' >> /etc/radvd.conf\n echo \' prefix 2001:DB8::/64 { \' >> /etc/radvd.conf\n echo \' AdvOnLink on; }; \' >> /etc/radvd.conf\n echo \' }; \' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service="$service"; then\n firewall-cmd --add-service "$service"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882379.467586-12059-96980598354643/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11661 1726882381.10671: _low_level_execute_command(): starting 11661 1726882381.10676: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882379.467586-12059-96980598354643/ > /dev/null 2>&1 && sleep 0' 11661 1726882381.11115: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882381.11133: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882381.11151: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882381.11165: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882381.11207: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882381.11219: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882381.11326: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882381.13169: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882381.13214: stderr chunk (state=3): >>><<< 11661 1726882381.13217: stdout chunk (state=3): >>><<< 11661 1726882381.13229: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882381.13235: handler run complete 11661 1726882381.13254: Evaluated conditional (False): False 11661 1726882381.13261: attempt loop complete, returning result 11661 1726882381.13265: _execute() done 11661 1726882381.13268: dumping result to json 11661 1726882381.13275: done dumping result, returning 11661 1726882381.13284: done running TaskExecutor() for managed_node2/TASK: Create test interfaces [0e448fcc-3ce9-896b-2321-000000000012] 11661 1726882381.13289: sending task result for task 0e448fcc-3ce9-896b-2321-000000000012 11661 1726882381.13395: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000012 11661 1726882381.13399: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! 
ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "delta": "0:00:01.224347", "end": "2024-09-20 21:33:01.089115", "rc": 0, "start": "2024-09-20 21:32:59.864768" } STDERR: + exec + ip link add test1 type veth peer name test1p + ip link add test2 type veth peer name test2p ++ pgrep NetworkManager + '[' -n 6692 ']' + nmcli d set test1 managed true + nmcli d set test2 managed true + nmcli d set test1p managed false + nmcli d set test2p managed false + ip link set test1p up + ip link set test2p up + ip link add name testbr type bridge forward_delay 0 ++ pgrep NetworkManager + '[' -n 6692 ']' + nmcli d set testbr managed false + ip link set testbr up + timer=0 + ip addr show testbr + grep -q 'inet [1-9]' + let timer+=1 + '[' 1 -eq 30 ']' + sleep 1 + rc=0 + ip addr add 192.0.2.1/24 dev testbr + '[' 0 '!=' 0 ']' + ip -6 addr add 2001:DB8::1/32 dev testbr + '[' 0 '!=' 0 ']' + grep -q 'inet [1-9]' + ip addr show testbr + grep 'release 6' /etc/redhat-release + ip link set test1p master testbr + ip link set test2p master testbr + systemctl is-active firewalld inactive + dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces 11661 1726882381.13476: no more pending results, returning what we have 11661 1726882381.13480: results queue empty 11661 1726882381.13480: checking for any_errors_fatal 11661 1726882381.13488: done checking for any_errors_fatal 11661 1726882381.13489: checking for max_fail_percentage 11661 1726882381.13491: done checking for max_fail_percentage 11661 1726882381.13491: checking to see if all hosts have failed 
and the running result is not ok 11661 1726882381.13492: done checking to see if all hosts have failed 11661 1726882381.13493: getting the remaining hosts for this loop 11661 1726882381.13494: done getting the remaining hosts for this loop 11661 1726882381.13497: getting the next task for host managed_node2 11661 1726882381.13506: done getting next task for host managed_node2 11661 1726882381.13510: ^ task is: TASK: Include the task 'get_interface_stat.yml' 11661 1726882381.13512: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11661 1726882381.13515: getting variables 11661 1726882381.13517: in VariableManager get_vars() 11661 1726882381.13558: Calling all_inventory to load vars for managed_node2 11661 1726882381.13561: Calling groups_inventory to load vars for managed_node2 11661 1726882381.13563: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882381.13574: Calling all_plugins_play to load vars for managed_node2 11661 1726882381.13577: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882381.13579: Calling groups_plugins_play to load vars for managed_node2 11661 1726882381.13741: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882381.13862: done with get_vars() 11661 1726882381.13872: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:33:01 -0400 (0:00:01.713) 0:00:09.852 ****** 11661 1726882381.13934: entering _queue_task() for managed_node2/include_tasks 11661 1726882381.14247: worker is 1 (out of 1 available) 11661 1726882381.14303: exiting _queue_task() for managed_node2/include_tasks 11661 1726882381.14315: done queuing things up, now waiting for results queue to drain 11661 1726882381.14317: waiting for pending results... 
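The "Create test interfaces" task above works around a NetworkManager race (tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642) with a bounded retry loop: it re-runs `ip addr add` on `testbr` once per second, tolerating transient failures, and aborts after 30 attempts. The same pattern can be sketched in Python; this is an illustrative sketch of the retry structure only (function names are hypothetical, not part of the playbook):

```python
import time

def retry_until(check, attempt, max_tries=30, delay=1.0):
    """Retry `attempt` until `check` passes or `max_tries` is exhausted.

    Mirrors the shell loop in the task: each iteration re-runs the
    address assignment, swallows transient failures, and gives up
    with an error once the timer reaches max_tries.
    """
    for tries in range(max_tries):
        if check():
            return tries  # number of attempts made before success
        try:
            attempt()
        except OSError:
            # transient failure (e.g. NM removed the address); retry
            pass
        time.sleep(delay)
    raise TimeoutError("condition not met after %d tries" % max_tries)

# toy stand-ins: the "address" sticks after the 3rd attempt
state = {"n": 0}

def fake_attempt():
    state["n"] += 1

def fake_check():
    return state["n"] >= 3
```

With `delay=0`, `retry_until(fake_check, fake_attempt, delay=0)` returns after three attempts, matching the shell loop's behaviour of checking first and attempting only while the check fails.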
11661 1726882381.14431: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 11661 1726882381.14506: in run() - task 0e448fcc-3ce9-896b-2321-000000000016 11661 1726882381.14513: variable 'ansible_search_path' from source: unknown 11661 1726882381.14516: variable 'ansible_search_path' from source: unknown 11661 1726882381.14591: calling self._execute() 11661 1726882381.14643: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882381.14646: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882381.14656: variable 'omit' from source: magic vars 11661 1726882381.14928: variable 'ansible_distribution_major_version' from source: facts 11661 1726882381.14937: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882381.14946: _execute() done 11661 1726882381.14951: dumping result to json 11661 1726882381.14954: done dumping result, returning 11661 1726882381.14957: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [0e448fcc-3ce9-896b-2321-000000000016] 11661 1726882381.14962: sending task result for task 0e448fcc-3ce9-896b-2321-000000000016 11661 1726882381.15044: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000016 11661 1726882381.15046: WORKER PROCESS EXITING 11661 1726882381.15108: no more pending results, returning what we have 11661 1726882381.15112: in VariableManager get_vars() 11661 1726882381.15146: Calling all_inventory to load vars for managed_node2 11661 1726882381.15149: Calling groups_inventory to load vars for managed_node2 11661 1726882381.15151: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882381.15161: Calling all_plugins_play to load vars for managed_node2 11661 1726882381.15162: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882381.15165: Calling groups_plugins_play to load vars for managed_node2 11661 
1726882381.15272: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882381.15382: done with get_vars() 11661 1726882381.15388: variable 'ansible_search_path' from source: unknown 11661 1726882381.15389: variable 'ansible_search_path' from source: unknown 11661 1726882381.15414: we have included files to process 11661 1726882381.15415: generating all_blocks data 11661 1726882381.15416: done generating all_blocks data 11661 1726882381.15417: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11661 1726882381.15417: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11661 1726882381.15419: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11661 1726882381.15573: done processing included file 11661 1726882381.15574: iterating over new_blocks loaded from include file 11661 1726882381.15575: in VariableManager get_vars() 11661 1726882381.15587: done with get_vars() 11661 1726882381.15589: filtering new block on tags 11661 1726882381.15600: done filtering new block on tags 11661 1726882381.15602: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 11661 1726882381.15604: extending task lists for all hosts with included blocks 11661 1726882381.15666: done extending task lists 11661 1726882381.15667: done processing included files 11661 1726882381.15668: results queue empty 11661 1726882381.15668: checking for any_errors_fatal 11661 1726882381.15672: done checking for any_errors_fatal 11661 1726882381.15672: checking for max_fail_percentage 11661 1726882381.15673: done checking for 
max_fail_percentage 11661 1726882381.15673: checking to see if all hosts have failed and the running result is not ok 11661 1726882381.15674: done checking to see if all hosts have failed 11661 1726882381.15674: getting the remaining hosts for this loop 11661 1726882381.15675: done getting the remaining hosts for this loop 11661 1726882381.15677: getting the next task for host managed_node2 11661 1726882381.15679: done getting next task for host managed_node2 11661 1726882381.15681: ^ task is: TASK: Get stat for interface {{ interface }} 11661 1726882381.15682: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11661 1726882381.15684: getting variables 11661 1726882381.15684: in VariableManager get_vars() 11661 1726882381.15693: Calling all_inventory to load vars for managed_node2 11661 1726882381.15694: Calling groups_inventory to load vars for managed_node2 11661 1726882381.15695: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882381.15699: Calling all_plugins_play to load vars for managed_node2 11661 1726882381.15701: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882381.15703: Calling groups_plugins_play to load vars for managed_node2 11661 1726882381.15802: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882381.15912: done with get_vars() 11661 1726882381.15919: done getting variables 11661 1726882381.16028: variable 'interface' from source: task vars 11661 1726882381.16032: variable 'dhcp_interface1' from source: play vars 11661 1726882381.16078: variable 'dhcp_interface1' from source: play vars TASK [Get stat for interface test1] ******************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:33:01 -0400 (0:00:00.021) 0:00:09.874 ****** 11661 1726882381.16107: entering _queue_task() for managed_node2/stat 11661 1726882381.16267: worker is 1 (out of 1 available) 11661 1726882381.16278: exiting _queue_task() for managed_node2/stat 11661 1726882381.16290: done queuing things up, now waiting for results queue to drain 11661 1726882381.16292: waiting for pending results... 
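The stat task that runs next detects the interface by stat-ing `/sys/class/net/test1`; as the module result below shows, that entry is a symlink into `/sys/devices/virtual/net/` when the kernel exposes the device. A minimal Python equivalent of that presence check (path layout per sysfs conventions; no Ansible involved, and the function name is hypothetical):

```python
import os

def interface_present(name, sysfs_root="/sys/class/net"):
    """Return True if the kernel exposes a network interface `name`.

    /sys/class/net entries are symlinks to the device directory
    (e.g. ../../devices/virtual/net/test1 for a veth peer), so
    lexists() on the link is enough to detect presence, matching
    the stat module's `stat.exists` field in the task result.
    """
    return os.path.lexists(os.path.join(sysfs_root, name))
```

On the managed node this returns True for `test1` once the veth pair from the earlier task exists, and False for any interface name that was never created.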
11661 1726882381.16432: running TaskExecutor() for managed_node2/TASK: Get stat for interface test1 11661 1726882381.16501: in run() - task 0e448fcc-3ce9-896b-2321-000000000152 11661 1726882381.16513: variable 'ansible_search_path' from source: unknown 11661 1726882381.16516: variable 'ansible_search_path' from source: unknown 11661 1726882381.16540: calling self._execute() 11661 1726882381.16596: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882381.16606: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882381.16614: variable 'omit' from source: magic vars 11661 1726882381.16850: variable 'ansible_distribution_major_version' from source: facts 11661 1726882381.16862: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882381.16869: variable 'omit' from source: magic vars 11661 1726882381.16906: variable 'omit' from source: magic vars 11661 1726882381.16970: variable 'interface' from source: task vars 11661 1726882381.16974: variable 'dhcp_interface1' from source: play vars 11661 1726882381.17017: variable 'dhcp_interface1' from source: play vars 11661 1726882381.17031: variable 'omit' from source: magic vars 11661 1726882381.17066: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11661 1726882381.17098: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11661 1726882381.17112: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11661 1726882381.17127: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882381.17137: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882381.17162: variable 'inventory_hostname' from source: host vars for 
'managed_node2' 11661 1726882381.17166: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882381.17169: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882381.17234: Set connection var ansible_connection to ssh 11661 1726882381.17240: Set connection var ansible_pipelining to False 11661 1726882381.17244: Set connection var ansible_module_compression to ZIP_DEFLATED 11661 1726882381.17253: Set connection var ansible_timeout to 10 11661 1726882381.17259: Set connection var ansible_shell_type to sh 11661 1726882381.17267: Set connection var ansible_shell_executable to /bin/sh 11661 1726882381.17282: variable 'ansible_shell_executable' from source: unknown 11661 1726882381.17285: variable 'ansible_connection' from source: unknown 11661 1726882381.17287: variable 'ansible_module_compression' from source: unknown 11661 1726882381.17291: variable 'ansible_shell_type' from source: unknown 11661 1726882381.17296: variable 'ansible_shell_executable' from source: unknown 11661 1726882381.17299: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882381.17305: variable 'ansible_pipelining' from source: unknown 11661 1726882381.17308: variable 'ansible_timeout' from source: unknown 11661 1726882381.17312: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882381.17452: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11661 1726882381.17457: variable 'omit' from source: magic vars 11661 1726882381.17465: starting attempt loop 11661 1726882381.17468: running the handler 11661 1726882381.17477: _low_level_execute_command(): starting 11661 1726882381.17484: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11661 
1726882381.17985: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882381.18010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882381.18025: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882381.18036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882381.18084: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882381.18095: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882381.18206: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882381.19881: stdout chunk (state=3): >>>/root <<< 11661 1726882381.19984: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882381.20036: stderr chunk (state=3): >>><<< 11661 1726882381.20039: stdout chunk (state=3): >>><<< 11661 1726882381.20067: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882381.20077: _low_level_execute_command(): starting 11661 1726882381.20082: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882381.2006636-12136-49335458029077 `" && echo ansible-tmp-1726882381.2006636-12136-49335458029077="` echo /root/.ansible/tmp/ansible-tmp-1726882381.2006636-12136-49335458029077 `" ) && sleep 0' 11661 1726882381.20523: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882381.20540: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882381.20556: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 11661 1726882381.20584: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882381.20620: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882381.20632: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882381.20740: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882381.22651: stdout chunk (state=3): >>>ansible-tmp-1726882381.2006636-12136-49335458029077=/root/.ansible/tmp/ansible-tmp-1726882381.2006636-12136-49335458029077 <<< 11661 1726882381.22760: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882381.22811: stderr chunk (state=3): >>><<< 11661 1726882381.22814: stdout chunk (state=3): >>><<< 11661 1726882381.22830: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882381.2006636-12136-49335458029077=/root/.ansible/tmp/ansible-tmp-1726882381.2006636-12136-49335458029077 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882381.22875: variable 'ansible_module_compression' from source: unknown 11661 1726882381.22922: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11661k_nrvl51/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11661 1726882381.22952: variable 'ansible_facts' from source: unknown 11661 1726882381.23014: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882381.2006636-12136-49335458029077/AnsiballZ_stat.py 11661 1726882381.23119: Sending initial data 11661 1726882381.23128: Sent initial data (152 bytes) 11661 1726882381.23811: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882381.23815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882381.23847: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 11661 1726882381.23851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
<<< 11661 1726882381.23854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 11661 1726882381.23856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882381.23912: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882381.23916: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882381.23921: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882381.24020: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882381.25790: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11661 1726882381.25885: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 11661 1726882381.25985: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11661k_nrvl51/tmpn60kaje2 /root/.ansible/tmp/ansible-tmp-1726882381.2006636-12136-49335458029077/AnsiballZ_stat.py <<< 11661 1726882381.26081: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11661 1726882381.27106: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882381.27215: stderr chunk (state=3): >>><<< 11661 1726882381.27218: stdout chunk (state=3): >>><<< 11661 1726882381.27236: done transferring module to remote 11661 1726882381.27247: _low_level_execute_command(): starting 11661 1726882381.27252: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882381.2006636-12136-49335458029077/ /root/.ansible/tmp/ansible-tmp-1726882381.2006636-12136-49335458029077/AnsiballZ_stat.py && sleep 0' 11661 1726882381.27723: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882381.27736: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882381.27760: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 11661 1726882381.27774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882381.27825: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882381.27836: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882381.27944: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 11661 1726882381.29772: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882381.29825: stderr chunk (state=3): >>><<< 11661 1726882381.29828: stdout chunk (state=3): >>><<< 11661 1726882381.29845: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882381.29848: _low_level_execute_command(): starting 11661 1726882381.29857: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882381.2006636-12136-49335458029077/AnsiballZ_stat.py && sleep 0' 11661 1726882381.30312: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882381.30328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882381.30343: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 11661 1726882381.30355: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882381.30367: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882381.30412: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882381.30423: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882381.30537: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882381.44172: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 25129, "dev": 21, "nlink": 1, "atime": 1726882379.8734004, "mtime": 1726882379.8734004, "ctime": 1726882379.8734004, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": 
"/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11661 1726882381.44976: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 11661 1726882381.44981: stdout chunk (state=3): >>><<< 11661 1726882381.44987: stderr chunk (state=3): >>><<< 11661 1726882381.45003: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 25129, "dev": 21, "nlink": 1, "atime": 1726882379.8734004, "mtime": 1726882379.8734004, "ctime": 1726882379.8734004, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 11661 1726882381.45060: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882381.2006636-12136-49335458029077/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11661 1726882381.45071: _low_level_execute_command(): starting 11661 1726882381.45076: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882381.2006636-12136-49335458029077/ > /dev/null 2>&1 && sleep 0' 11661 1726882381.46369: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882381.47148: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 11661 1726882381.47162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882381.47179: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882381.47220: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882381.47227: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882381.47237: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882381.47251: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882381.47261: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882381.47269: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882381.47277: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882381.47286: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882381.47297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882381.47304: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882381.47310: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882381.47319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882381.47397: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882381.47415: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882381.47428: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882381.47856: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 11661 1726882381.49520: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882381.49524: stdout chunk (state=3): >>><<< 11661 1726882381.49526: stderr chunk (state=3): >>><<< 11661 1726882381.49579: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882381.49583: handler run complete 11661 1726882381.49855: attempt loop complete, returning result 11661 1726882381.49858: _execute() done 11661 1726882381.49861: dumping result to json 11661 1726882381.49867: done dumping result, returning 11661 1726882381.49870: done running TaskExecutor() for managed_node2/TASK: Get stat for interface test1 [0e448fcc-3ce9-896b-2321-000000000152] 11661 1726882381.49872: sending task result for task 0e448fcc-3ce9-896b-2321-000000000152 11661 1726882381.49954: done 
sending task result for task 0e448fcc-3ce9-896b-2321-000000000152 11661 1726882381.49957: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "atime": 1726882379.8734004, "block_size": 4096, "blocks": 0, "ctime": 1726882379.8734004, "dev": 21, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 25129, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "mode": "0777", "mtime": 1726882379.8734004, "nlink": 1, "path": "/sys/class/net/test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 11661 1726882381.50059: no more pending results, returning what we have 11661 1726882381.50063: results queue empty 11661 1726882381.50065: checking for any_errors_fatal 11661 1726882381.50067: done checking for any_errors_fatal 11661 1726882381.50068: checking for max_fail_percentage 11661 1726882381.50069: done checking for max_fail_percentage 11661 1726882381.50070: checking to see if all hosts have failed and the running result is not ok 11661 1726882381.50071: done checking to see if all hosts have failed 11661 1726882381.50072: getting the remaining hosts for this loop 11661 1726882381.50074: done getting the remaining hosts for this loop 11661 1726882381.50077: getting the next task for host managed_node2 11661 1726882381.50086: done getting next task for host managed_node2 11661 1726882381.50088: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 11661 1726882381.50091: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child 
state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11661 1726882381.50096: getting variables 11661 1726882381.50098: in VariableManager get_vars() 11661 1726882381.50138: Calling all_inventory to load vars for managed_node2 11661 1726882381.50141: Calling groups_inventory to load vars for managed_node2 11661 1726882381.50143: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882381.50154: Calling all_plugins_play to load vars for managed_node2 11661 1726882381.50156: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882381.50160: Calling groups_plugins_play to load vars for managed_node2 11661 1726882381.50516: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882381.50705: done with get_vars() 11661 1726882381.50715: done getting variables 11661 1726882381.50808: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 11661 1726882381.50925: variable 'interface' from source: task vars 11661 1726882381.50929: variable 'dhcp_interface1' from source: play vars 11661 1726882381.51244: variable 'dhcp_interface1' from source: play vars TASK [Assert that the interface is present - 'test1'] ************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 
21:33:01 -0400 (0:00:00.351) 0:00:10.226 ****** 11661 1726882381.51278: entering _queue_task() for managed_node2/assert 11661 1726882381.51284: Creating lock for assert 11661 1726882381.51539: worker is 1 (out of 1 available) 11661 1726882381.51550: exiting _queue_task() for managed_node2/assert 11661 1726882381.51562: done queuing things up, now waiting for results queue to drain 11661 1726882381.51565: waiting for pending results... 11661 1726882381.52815: running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'test1' 11661 1726882381.52936: in run() - task 0e448fcc-3ce9-896b-2321-000000000017 11661 1726882381.52958: variable 'ansible_search_path' from source: unknown 11661 1726882381.52968: variable 'ansible_search_path' from source: unknown 11661 1726882381.53005: calling self._execute() 11661 1726882381.53086: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882381.53095: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882381.53109: variable 'omit' from source: magic vars 11661 1726882381.53441: variable 'ansible_distribution_major_version' from source: facts 11661 1726882381.54083: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882381.54096: variable 'omit' from source: magic vars 11661 1726882381.54144: variable 'omit' from source: magic vars 11661 1726882381.54253: variable 'interface' from source: task vars 11661 1726882381.54265: variable 'dhcp_interface1' from source: play vars 11661 1726882381.54331: variable 'dhcp_interface1' from source: play vars 11661 1726882381.54356: variable 'omit' from source: magic vars 11661 1726882381.54402: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11661 1726882381.54441: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11661 1726882381.54473: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11661 1726882381.54496: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882381.54512: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882381.54546: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11661 1726882381.54558: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882381.54567: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882381.54658: Set connection var ansible_connection to ssh 11661 1726882381.55380: Set connection var ansible_pipelining to False 11661 1726882381.55392: Set connection var ansible_module_compression to ZIP_DEFLATED 11661 1726882381.55405: Set connection var ansible_timeout to 10 11661 1726882381.55412: Set connection var ansible_shell_type to sh 11661 1726882381.55424: Set connection var ansible_shell_executable to /bin/sh 11661 1726882381.55455: variable 'ansible_shell_executable' from source: unknown 11661 1726882381.55467: variable 'ansible_connection' from source: unknown 11661 1726882381.55475: variable 'ansible_module_compression' from source: unknown 11661 1726882381.55482: variable 'ansible_shell_type' from source: unknown 11661 1726882381.55490: variable 'ansible_shell_executable' from source: unknown 11661 1726882381.55496: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882381.55507: variable 'ansible_pipelining' from source: unknown 11661 1726882381.55514: variable 'ansible_timeout' from source: unknown 11661 1726882381.55522: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882381.55666: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11661 1726882381.55684: variable 'omit' from source: magic vars 11661 1726882381.55694: starting attempt loop 11661 1726882381.55701: running the handler 11661 1726882381.55843: variable 'interface_stat' from source: set_fact 11661 1726882381.55875: Evaluated conditional (interface_stat.stat.exists): True 11661 1726882381.55886: handler run complete 11661 1726882381.55904: attempt loop complete, returning result 11661 1726882381.55910: _execute() done 11661 1726882381.55917: dumping result to json 11661 1726882381.55923: done dumping result, returning 11661 1726882381.55934: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'test1' [0e448fcc-3ce9-896b-2321-000000000017] 11661 1726882381.55945: sending task result for task 0e448fcc-3ce9-896b-2321-000000000017 ok: [managed_node2] => { "changed": false } MSG: All assertions passed 11661 1726882381.56102: no more pending results, returning what we have 11661 1726882381.56105: results queue empty 11661 1726882381.56106: checking for any_errors_fatal 11661 1726882381.56116: done checking for any_errors_fatal 11661 1726882381.56116: checking for max_fail_percentage 11661 1726882381.56118: done checking for max_fail_percentage 11661 1726882381.56119: checking to see if all hosts have failed and the running result is not ok 11661 1726882381.56119: done checking to see if all hosts have failed 11661 1726882381.56120: getting the remaining hosts for this loop 11661 1726882381.56122: done getting the remaining hosts for this loop 11661 1726882381.56125: getting the next task for host managed_node2 11661 1726882381.56134: done getting next task for host managed_node2 11661 1726882381.56136: ^ task is: TASK: Include the task 'get_interface_stat.yml' 11661 1726882381.56139: ^ state is: HOST STATE: block=2, task=4, 
rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11661 1726882381.56142: getting variables 11661 1726882381.56143: in VariableManager get_vars() 11661 1726882381.56185: Calling all_inventory to load vars for managed_node2 11661 1726882381.56188: Calling groups_inventory to load vars for managed_node2 11661 1726882381.56190: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882381.56202: Calling all_plugins_play to load vars for managed_node2 11661 1726882381.56204: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882381.56208: Calling groups_plugins_play to load vars for managed_node2 11661 1726882381.56357: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000017 11661 1726882381.56360: WORKER PROCESS EXITING 11661 1726882381.56386: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882381.56588: done with get_vars() 11661 1726882381.56605: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:33:01 -0400 (0:00:00.054) 0:00:10.280 ****** 11661 1726882381.56771: entering _queue_task() for managed_node2/include_tasks 11661 1726882381.57317: worker is 1 (out of 1 available) 11661 1726882381.57329: exiting _queue_task() for managed_node2/include_tasks 
11661 1726882381.57342: done queuing things up, now waiting for results queue to drain 11661 1726882381.57343: waiting for pending results... 11661 1726882381.58608: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 11661 1726882381.59579: in run() - task 0e448fcc-3ce9-896b-2321-00000000001b 11661 1726882381.59600: variable 'ansible_search_path' from source: unknown 11661 1726882381.59607: variable 'ansible_search_path' from source: unknown 11661 1726882381.59649: calling self._execute() 11661 1726882381.59737: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882381.59749: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882381.59769: variable 'omit' from source: magic vars 11661 1726882381.60122: variable 'ansible_distribution_major_version' from source: facts 11661 1726882381.60782: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882381.60795: _execute() done 11661 1726882381.60804: dumping result to json 11661 1726882381.60812: done dumping result, returning 11661 1726882381.60822: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [0e448fcc-3ce9-896b-2321-00000000001b] 11661 1726882381.60832: sending task result for task 0e448fcc-3ce9-896b-2321-00000000001b 11661 1726882381.60968: no more pending results, returning what we have 11661 1726882381.60974: in VariableManager get_vars() 11661 1726882381.61019: Calling all_inventory to load vars for managed_node2 11661 1726882381.61022: Calling groups_inventory to load vars for managed_node2 11661 1726882381.61024: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882381.61038: Calling all_plugins_play to load vars for managed_node2 11661 1726882381.61041: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882381.61045: Calling groups_plugins_play to load vars for managed_node2 11661 
1726882381.61205: done sending task result for task 0e448fcc-3ce9-896b-2321-00000000001b 11661 1726882381.61208: WORKER PROCESS EXITING 11661 1726882381.61230: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882381.61479: done with get_vars() 11661 1726882381.61487: variable 'ansible_search_path' from source: unknown 11661 1726882381.61488: variable 'ansible_search_path' from source: unknown 11661 1726882381.61527: we have included files to process 11661 1726882381.61528: generating all_blocks data 11661 1726882381.61530: done generating all_blocks data 11661 1726882381.61533: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11661 1726882381.61535: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11661 1726882381.61536: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11661 1726882381.61739: done processing included file 11661 1726882381.61741: iterating over new_blocks loaded from include file 11661 1726882381.61743: in VariableManager get_vars() 11661 1726882381.61763: done with get_vars() 11661 1726882381.61766: filtering new block on tags 11661 1726882381.61818: done filtering new block on tags 11661 1726882381.61821: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 11661 1726882381.61831: extending task lists for all hosts with included blocks 11661 1726882381.61937: done extending task lists 11661 1726882381.61939: done processing included files 11661 1726882381.61940: results queue empty 11661 1726882381.61940: checking for any_errors_fatal 11661 1726882381.61944: 
done checking for any_errors_fatal 11661 1726882381.61945: checking for max_fail_percentage 11661 1726882381.61946: done checking for max_fail_percentage 11661 1726882381.61946: checking to see if all hosts have failed and the running result is not ok 11661 1726882381.61947: done checking to see if all hosts have failed 11661 1726882381.61948: getting the remaining hosts for this loop 11661 1726882381.61949: done getting the remaining hosts for this loop 11661 1726882381.61951: getting the next task for host managed_node2 11661 1726882381.61955: done getting next task for host managed_node2 11661 1726882381.61957: ^ task is: TASK: Get stat for interface {{ interface }} 11661 1726882381.61961: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11661 1726882381.61963: getting variables 11661 1726882381.61965: in VariableManager get_vars() 11661 1726882381.61978: Calling all_inventory to load vars for managed_node2 11661 1726882381.61980: Calling groups_inventory to load vars for managed_node2 11661 1726882381.61983: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882381.61987: Calling all_plugins_play to load vars for managed_node2 11661 1726882381.61990: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882381.61992: Calling groups_plugins_play to load vars for managed_node2 11661 1726882381.62138: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882381.62335: done with get_vars() 11661 1726882381.62343: done getting variables 11661 1726882381.62505: variable 'interface' from source: task vars 11661 1726882381.62597: variable 'dhcp_interface2' from source: play vars 11661 1726882381.62681: variable 'dhcp_interface2' from source: play vars TASK [Get stat for interface test2] ******************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:33:01 -0400 (0:00:00.060) 0:00:10.340 ****** 11661 1726882381.62716: entering _queue_task() for managed_node2/stat 11661 1726882381.63007: worker is 1 (out of 1 available) 11661 1726882381.63024: exiting _queue_task() for managed_node2/stat 11661 1726882381.63036: done queuing things up, now waiting for results queue to drain 11661 1726882381.63037: waiting for pending results... 
11661 1726882381.63323: running TaskExecutor() for managed_node2/TASK: Get stat for interface test2 11661 1726882381.63483: in run() - task 0e448fcc-3ce9-896b-2321-00000000016a 11661 1726882381.63502: variable 'ansible_search_path' from source: unknown 11661 1726882381.63510: variable 'ansible_search_path' from source: unknown 11661 1726882381.63548: calling self._execute() 11661 1726882381.63637: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882381.63649: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882381.63667: variable 'omit' from source: magic vars 11661 1726882381.64038: variable 'ansible_distribution_major_version' from source: facts 11661 1726882381.64054: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882381.64065: variable 'omit' from source: magic vars 11661 1726882381.64119: variable 'omit' from source: magic vars 11661 1726882381.64213: variable 'interface' from source: task vars 11661 1726882381.64228: variable 'dhcp_interface2' from source: play vars 11661 1726882381.64296: variable 'dhcp_interface2' from source: play vars 11661 1726882381.64321: variable 'omit' from source: magic vars 11661 1726882381.64383: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11661 1726882381.64420: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11661 1726882381.64455: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11661 1726882381.64481: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882381.64499: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882381.64534: variable 'inventory_hostname' from source: host vars for 
'managed_node2' 11661 1726882381.64542: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882381.64557: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882381.64668: Set connection var ansible_connection to ssh 11661 1726882381.64683: Set connection var ansible_pipelining to False 11661 1726882381.64694: Set connection var ansible_module_compression to ZIP_DEFLATED 11661 1726882381.64705: Set connection var ansible_timeout to 10 11661 1726882381.64712: Set connection var ansible_shell_type to sh 11661 1726882381.64723: Set connection var ansible_shell_executable to /bin/sh 11661 1726882381.64798: variable 'ansible_shell_executable' from source: unknown 11661 1726882381.64806: variable 'ansible_connection' from source: unknown 11661 1726882381.64813: variable 'ansible_module_compression' from source: unknown 11661 1726882381.64845: variable 'ansible_shell_type' from source: unknown 11661 1726882381.64877: variable 'ansible_shell_executable' from source: unknown 11661 1726882381.64891: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882381.64900: variable 'ansible_pipelining' from source: unknown 11661 1726882381.64962: variable 'ansible_timeout' from source: unknown 11661 1726882381.64985: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882381.65841: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11661 1726882381.65985: variable 'omit' from source: magic vars 11661 1726882381.65997: starting attempt loop 11661 1726882381.66004: running the handler 11661 1726882381.66050: _low_level_execute_command(): starting 11661 1726882381.66076: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11661 
1726882381.67502: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882381.67522: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882381.67537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882381.67555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882381.67600: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882381.67616: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882381.67635: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882381.67653: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882381.67667: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882381.67677: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882381.67686: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882381.67697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882381.67709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882381.67718: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882381.67732: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882381.67743: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882381.67814: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882381.67837: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882381.67856: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882381.68006: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882381.69681: stdout chunk (state=3): >>>/root <<< 11661 1726882381.69877: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882381.69880: stdout chunk (state=3): >>><<< 11661 1726882381.69883: stderr chunk (state=3): >>><<< 11661 1726882381.70001: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882381.70004: _low_level_execute_command(): starting 11661 1726882381.70007: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882381.6990504-12154-50287596964975 `" && echo 
ansible-tmp-1726882381.6990504-12154-50287596964975="` echo /root/.ansible/tmp/ansible-tmp-1726882381.6990504-12154-50287596964975 `" ) && sleep 0' 11661 1726882381.70599: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882381.70612: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882381.70625: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882381.70644: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882381.70694: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882381.70722: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882381.70749: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882381.70779: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882381.70825: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882381.70838: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882381.70849: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882381.70864: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882381.70886: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882381.70897: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882381.70918: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882381.70936: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882381.71027: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master <<< 11661 1726882381.71055: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882381.71079: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882381.71228: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882381.73168: stdout chunk (state=3): >>>ansible-tmp-1726882381.6990504-12154-50287596964975=/root/.ansible/tmp/ansible-tmp-1726882381.6990504-12154-50287596964975 <<< 11661 1726882381.73370: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882381.73374: stdout chunk (state=3): >>><<< 11661 1726882381.73376: stderr chunk (state=3): >>><<< 11661 1726882381.73671: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882381.6990504-12154-50287596964975=/root/.ansible/tmp/ansible-tmp-1726882381.6990504-12154-50287596964975 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882381.73675: variable 'ansible_module_compression' from source: unknown 11661 1726882381.73677: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11661k_nrvl51/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11661 1726882381.73679: variable 'ansible_facts' from source: unknown 11661 1726882381.73681: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882381.6990504-12154-50287596964975/AnsiballZ_stat.py 11661 1726882381.74013: Sending initial data 11661 1726882381.74016: Sent initial data (152 bytes) 11661 1726882381.75044: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882381.75870: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882381.75889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882381.75909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882381.75951: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882381.75965: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882381.75982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882381.75999: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882381.76010: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882381.76020: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882381.76030: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882381.76042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882381.76056: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882381.76070: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882381.76080: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882381.76096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882381.76980: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882381.77005: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882381.77021: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882381.77157: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882381.78992: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11661 1726882381.79083: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 11661 1726882381.79183: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11661k_nrvl51/tmpjy4rwmc9 /root/.ansible/tmp/ansible-tmp-1726882381.6990504-12154-50287596964975/AnsiballZ_stat.py <<< 11661 1726882381.79278: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11661 1726882381.80780: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882381.81021: stderr chunk (state=3): >>><<< 11661 1726882381.81025: stdout chunk (state=3): >>><<< 11661 1726882381.81027: done transferring module to remote 11661 1726882381.81035: _low_level_execute_command(): starting 11661 1726882381.81037: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882381.6990504-12154-50287596964975/ /root/.ansible/tmp/ansible-tmp-1726882381.6990504-12154-50287596964975/AnsiballZ_stat.py && sleep 0' 11661 1726882381.81823: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882381.81841: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882381.81857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882381.81879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882381.81939: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882381.81960: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882381.81978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882381.81995: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882381.82006: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882381.82016: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882381.82026: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882381.82038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882381.82059: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882381.82078: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882381.82091: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882381.82106: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882381.82192: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882381.82213: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882381.82230: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882381.82400: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882381.84285: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882381.84289: stdout chunk (state=3): >>><<< 11661 1726882381.84291: stderr chunk (state=3): >>><<< 11661 1726882381.84388: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882381.84392: _low_level_execute_command(): starting 11661 1726882381.84395: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882381.6990504-12154-50287596964975/AnsiballZ_stat.py && sleep 0' 11661 1726882381.85104: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882381.85160: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882381.85190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882381.85538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882381.85749: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882381.85763: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882381.85875: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882381.85936: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882381.85998: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882381.86017: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882381.86053: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882381.86086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882381.86103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882381.86116: stderr chunk (state=3): >>>debug2: checking match for 
'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882381.86128: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882381.86142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882381.86217: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882381.86239: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882381.86256: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882381.86530: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882381.99746: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 25715, "dev": 21, "nlink": 1, "atime": 1726882379.8813503, "mtime": 1726882379.8813503, "ctime": 1726882379.8813503, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11661 1726882382.00853: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 11661 1726882382.00965: stderr chunk (state=3): >>><<< 11661 1726882382.00969: stdout chunk (state=3): >>><<< 11661 1726882382.01043: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 25715, "dev": 21, "nlink": 1, "atime": 1726882379.8813503, "mtime": 1726882379.8813503, "ctime": 1726882379.8813503, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 11661 1726882382.01206: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test2', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882381.6990504-12154-50287596964975/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11661 1726882382.01209: _low_level_execute_command(): starting 11661 1726882382.01212: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882381.6990504-12154-50287596964975/ > /dev/null 2>&1 && sleep 0' 11661 1726882382.02277: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882382.02301: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882382.02335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882382.02353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882382.02487: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882382.02501: stderr 
chunk (state=3): >>>debug2: match not found <<< 11661 1726882382.02518: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882382.02536: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882382.02556: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882382.02571: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882382.02626: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882382.02640: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882382.02687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882382.02711: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882382.02739: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882382.02753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882382.02875: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882382.02927: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882382.02960: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882382.03135: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882382.04975: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882382.05097: stderr chunk (state=3): >>><<< 11661 1726882382.05117: stdout chunk (state=3): >>><<< 11661 1726882382.05236: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882382.05239: handler run complete 11661 1726882382.05242: attempt loop complete, returning result 11661 1726882382.05244: _execute() done 11661 1726882382.05327: dumping result to json 11661 1726882382.05338: done dumping result, returning 11661 1726882382.05340: done running TaskExecutor() for managed_node2/TASK: Get stat for interface test2 [0e448fcc-3ce9-896b-2321-00000000016a] 11661 1726882382.05343: sending task result for task 0e448fcc-3ce9-896b-2321-00000000016a 11661 1726882382.05433: done sending task result for task 0e448fcc-3ce9-896b-2321-00000000016a 11661 1726882382.05436: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "atime": 1726882379.8813503, "block_size": 4096, "blocks": 0, "ctime": 1726882379.8813503, "dev": 21, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 25715, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": 
false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "mode": "0777", "mtime": 1726882379.8813503, "nlink": 1, "path": "/sys/class/net/test2", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 11661 1726882382.05535: no more pending results, returning what we have 11661 1726882382.05539: results queue empty 11661 1726882382.05539: checking for any_errors_fatal 11661 1726882382.05541: done checking for any_errors_fatal 11661 1726882382.05542: checking for max_fail_percentage 11661 1726882382.05543: done checking for max_fail_percentage 11661 1726882382.05544: checking to see if all hosts have failed and the running result is not ok 11661 1726882382.05545: done checking to see if all hosts have failed 11661 1726882382.05546: getting the remaining hosts for this loop 11661 1726882382.05547: done getting the remaining hosts for this loop 11661 1726882382.05551: getting the next task for host managed_node2 11661 1726882382.05559: done getting next task for host managed_node2 11661 1726882382.05562: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 11661 1726882382.05567: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11661 1726882382.05571: getting variables 11661 1726882382.05572: in VariableManager get_vars() 11661 1726882382.05613: Calling all_inventory to load vars for managed_node2 11661 1726882382.05620: Calling groups_inventory to load vars for managed_node2 11661 1726882382.05622: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882382.05633: Calling all_plugins_play to load vars for managed_node2 11661 1726882382.05635: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882382.05638: Calling groups_plugins_play to load vars for managed_node2 11661 1726882382.06142: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882382.06447: done with get_vars() 11661 1726882382.06458: done getting variables 11661 1726882382.06539: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11661 1726882382.06694: variable 'interface' from source: task vars 11661 1726882382.06722: variable 'dhcp_interface2' from source: play vars 11661 1726882382.06796: variable 'dhcp_interface2' from source: play vars TASK [Assert that the interface is present - 'test2'] ************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:33:02 -0400 (0:00:00.441) 0:00:10.782 ****** 11661 1726882382.06916: entering _queue_task() for managed_node2/assert 11661 1726882382.07392: worker is 1 (out of 1 available) 11661 1726882382.07405: exiting _queue_task() for managed_node2/assert 11661 1726882382.07418: done queuing things up, now waiting for results queue to drain 11661 1726882382.07424: waiting for pending results... 
11661 1726882382.08144: running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'test2' 11661 1726882382.08379: in run() - task 0e448fcc-3ce9-896b-2321-00000000001c 11661 1726882382.08432: variable 'ansible_search_path' from source: unknown 11661 1726882382.08468: variable 'ansible_search_path' from source: unknown 11661 1726882382.08947: calling self._execute() 11661 1726882382.09181: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882382.09248: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882382.09287: variable 'omit' from source: magic vars 11661 1726882382.09871: variable 'ansible_distribution_major_version' from source: facts 11661 1726882382.09913: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882382.09923: variable 'omit' from source: magic vars 11661 1726882382.10047: variable 'omit' from source: magic vars 11661 1726882382.10212: variable 'interface' from source: task vars 11661 1726882382.10221: variable 'dhcp_interface2' from source: play vars 11661 1726882382.10298: variable 'dhcp_interface2' from source: play vars 11661 1726882382.10324: variable 'omit' from source: magic vars 11661 1726882382.10375: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11661 1726882382.10417: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11661 1726882382.10440: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11661 1726882382.10469: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882382.10486: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882382.10522: variable 'inventory_hostname' from source: host 
vars for 'managed_node2' 11661 1726882382.10530: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882382.10536: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882382.10644: Set connection var ansible_connection to ssh 11661 1726882382.10655: Set connection var ansible_pipelining to False 11661 1726882382.10670: Set connection var ansible_module_compression to ZIP_DEFLATED 11661 1726882382.10684: Set connection var ansible_timeout to 10 11661 1726882382.10690: Set connection var ansible_shell_type to sh 11661 1726882382.10700: Set connection var ansible_shell_executable to /bin/sh 11661 1726882382.10728: variable 'ansible_shell_executable' from source: unknown 11661 1726882382.10735: variable 'ansible_connection' from source: unknown 11661 1726882382.10741: variable 'ansible_module_compression' from source: unknown 11661 1726882382.10746: variable 'ansible_shell_type' from source: unknown 11661 1726882382.10752: variable 'ansible_shell_executable' from source: unknown 11661 1726882382.10758: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882382.10766: variable 'ansible_pipelining' from source: unknown 11661 1726882382.10773: variable 'ansible_timeout' from source: unknown 11661 1726882382.10785: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882382.10927: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11661 1726882382.10946: variable 'omit' from source: magic vars 11661 1726882382.10981: starting attempt loop 11661 1726882382.10989: running the handler 11661 1726882382.11136: variable 'interface_stat' from source: set_fact 11661 1726882382.11167: Evaluated conditional 
(interface_stat.stat.exists): True 11661 1726882382.11177: handler run complete 11661 1726882382.11193: attempt loop complete, returning result 11661 1726882382.11199: _execute() done 11661 1726882382.11204: dumping result to json 11661 1726882382.11215: done dumping result, returning 11661 1726882382.11226: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'test2' [0e448fcc-3ce9-896b-2321-00000000001c] 11661 1726882382.11235: sending task result for task 0e448fcc-3ce9-896b-2321-00000000001c ok: [managed_node2] => { "changed": false } MSG: All assertions passed 11661 1726882382.11469: no more pending results, returning what we have 11661 1726882382.11473: results queue empty 11661 1726882382.11473: checking for any_errors_fatal 11661 1726882382.11482: done checking for any_errors_fatal 11661 1726882382.11483: checking for max_fail_percentage 11661 1726882382.11484: done checking for max_fail_percentage 11661 1726882382.11485: checking to see if all hosts have failed and the running result is not ok 11661 1726882382.11486: done checking to see if all hosts have failed 11661 1726882382.11487: getting the remaining hosts for this loop 11661 1726882382.11488: done getting the remaining hosts for this loop 11661 1726882382.11492: getting the next task for host managed_node2 11661 1726882382.11500: done getting next task for host managed_node2 11661 1726882382.11503: ^ task is: TASK: Backup the /etc/resolv.conf for initscript 11661 1726882382.11505: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11661 1726882382.11509: getting variables 11661 1726882382.11511: in VariableManager get_vars() 11661 1726882382.11551: Calling all_inventory to load vars for managed_node2 11661 1726882382.11554: Calling groups_inventory to load vars for managed_node2 11661 1726882382.11556: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882382.11591: Calling all_plugins_play to load vars for managed_node2 11661 1726882382.11595: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882382.11600: Calling groups_plugins_play to load vars for managed_node2 11661 1726882382.11789: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882382.12002: done with get_vars() 11661 1726882382.12013: done getting variables 11661 1726882382.12069: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Backup the /etc/resolv.conf for initscript] ****************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:28 Friday 20 September 2024 21:33:02 -0400 (0:00:00.052) 0:00:10.834 ****** 11661 1726882382.12103: entering _queue_task() for managed_node2/command 11661 1726882382.12230: done sending task result for task 0e448fcc-3ce9-896b-2321-00000000001c 11661 1726882382.12233: WORKER PROCESS EXITING 11661 1726882382.12621: worker is 1 (out of 1 available) 11661 1726882382.12633: exiting _queue_task() for managed_node2/command 11661 1726882382.12645: done queuing things up, now waiting for results queue to drain 11661 1726882382.12647: waiting for pending results... 
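The trace above repeatedly records lines like `variable 'ansible_host' from source: host vars for 'managed_node2'` and `variable 'omit' from source: magic vars`: the VariableManager resolves each name against layered sources and logs which layer won. A minimal Python sketch of that kind of layered lookup, using `collections.ChainMap` (illustrative only — the layer names, ordering, and values below are simplified assumptions, not Ansible's actual ~22-level precedence table):

```python
from collections import ChainMap

# Layered variable sources, highest precedence first (simplified).
magic_vars = {"omit": "__omit_place_holder__"}
set_fact_vars = {"network_provider": "nm"}
host_vars = {"ansible_host": "10.31.10.217"}   # hypothetical address
play_vars = {"dhcp_interface2": "test2"}

variables = ChainMap(magic_vars, set_fact_vars, host_vars, play_vars)

def lookup(name):
    """Return (value, source) the way the trace logs it, e.g.
    "variable 'dhcp_interface2' from source: play vars"."""
    source_names = ["magic vars", "set_fact", "host vars", "play vars"]
    for layer, source in zip(variables.maps, source_names):
        if name in layer:
            return layer[name], source
    raise KeyError(name)

value, source = lookup("dhcp_interface2")
print(f"variable 'dhcp_interface2' from source: {source}")
```

`ChainMap` searches its maps left to right, so a name defined in an earlier (higher-precedence) layer shadows the same name in a later one — the same effect the trace shows when `set_fact` values win over play vars.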
11661 1726882382.12896: running TaskExecutor() for managed_node2/TASK: Backup the /etc/resolv.conf for initscript 11661 1726882382.12974: in run() - task 0e448fcc-3ce9-896b-2321-00000000001d 11661 1726882382.12999: variable 'ansible_search_path' from source: unknown 11661 1726882382.13043: calling self._execute() 11661 1726882382.13137: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882382.13149: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882382.13167: variable 'omit' from source: magic vars 11661 1726882382.13757: variable 'ansible_distribution_major_version' from source: facts 11661 1726882382.13785: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882382.13909: variable 'network_provider' from source: set_fact 11661 1726882382.13921: Evaluated conditional (network_provider == "initscripts"): False 11661 1726882382.13928: when evaluation is False, skipping this task 11661 1726882382.13935: _execute() done 11661 1726882382.13942: dumping result to json 11661 1726882382.13950: done dumping result, returning 11661 1726882382.13961: done running TaskExecutor() for managed_node2/TASK: Backup the /etc/resolv.conf for initscript [0e448fcc-3ce9-896b-2321-00000000001d] 11661 1726882382.13973: sending task result for task 0e448fcc-3ce9-896b-2321-00000000001d skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 11661 1726882382.14369: no more pending results, returning what we have 11661 1726882382.14374: results queue empty 11661 1726882382.14375: checking for any_errors_fatal 11661 1726882382.14382: done checking for any_errors_fatal 11661 1726882382.14383: checking for max_fail_percentage 11661 1726882382.14385: done checking for max_fail_percentage 11661 1726882382.14385: checking to see if all hosts have failed and the running result is not ok 11661 
1726882382.14386: done checking to see if all hosts have failed 11661 1726882382.14387: getting the remaining hosts for this loop 11661 1726882382.14389: done getting the remaining hosts for this loop 11661 1726882382.14392: getting the next task for host managed_node2 11661 1726882382.14399: done getting next task for host managed_node2 11661 1726882382.14402: ^ task is: TASK: TEST Add Bond with 2 ports 11661 1726882382.14405: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11661 1726882382.14408: getting variables 11661 1726882382.14410: in VariableManager get_vars() 11661 1726882382.14452: Calling all_inventory to load vars for managed_node2 11661 1726882382.14456: Calling groups_inventory to load vars for managed_node2 11661 1726882382.14459: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882382.14491: Calling all_plugins_play to load vars for managed_node2 11661 1726882382.14495: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882382.14499: Calling groups_plugins_play to load vars for managed_node2 11661 1726882382.15615: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882382.16711: done with get_vars() 11661 1726882382.16726: done getting variables 11661 1726882382.16783: done sending task result for task 0e448fcc-3ce9-896b-2321-00000000001d 11661 1726882382.16787: WORKER PROCESS EXITING 11661 1726882382.16882: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=True) TASK [TEST Add Bond with 2 ports] ********************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:33 Friday 20 September 2024 21:33:02 -0400 (0:00:00.048) 0:00:10.882 ****** 11661 1726882382.16913: entering _queue_task() for managed_node2/debug 11661 1726882382.17140: worker is 1 (out of 1 available) 11661 1726882382.17152: exiting _queue_task() for managed_node2/debug 11661 1726882382.17164: done queuing things up, now waiting for results queue to drain 11661 1726882382.17166: waiting for pending results... 11661 1726882382.17468: running TaskExecutor() for managed_node2/TASK: TEST Add Bond with 2 ports 11661 1726882382.17559: in run() - task 0e448fcc-3ce9-896b-2321-00000000001e 11661 1726882382.17580: variable 'ansible_search_path' from source: unknown 11661 1726882382.17624: calling self._execute() 11661 1726882382.17710: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882382.17724: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882382.17739: variable 'omit' from source: magic vars 11661 1726882382.18116: variable 'ansible_distribution_major_version' from source: facts 11661 1726882382.18132: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882382.18143: variable 'omit' from source: magic vars 11661 1726882382.18194: variable 'omit' from source: magic vars 11661 1726882382.18260: variable 'omit' from source: magic vars 11661 1726882382.18313: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11661 1726882382.18398: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11661 1726882382.18440: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11661 1726882382.18487: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882382.18503: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882382.18549: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11661 1726882382.18559: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882382.18569: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882382.18695: Set connection var ansible_connection to ssh 11661 1726882382.18711: Set connection var ansible_pipelining to False 11661 1726882382.18720: Set connection var ansible_module_compression to ZIP_DEFLATED 11661 1726882382.18731: Set connection var ansible_timeout to 10 11661 1726882382.18737: Set connection var ansible_shell_type to sh 11661 1726882382.18754: Set connection var ansible_shell_executable to /bin/sh 11661 1726882382.18780: variable 'ansible_shell_executable' from source: unknown 11661 1726882382.18788: variable 'ansible_connection' from source: unknown 11661 1726882382.18798: variable 'ansible_module_compression' from source: unknown 11661 1726882382.18808: variable 'ansible_shell_type' from source: unknown 11661 1726882382.18817: variable 'ansible_shell_executable' from source: unknown 11661 1726882382.18823: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882382.18829: variable 'ansible_pipelining' from source: unknown 11661 1726882382.18835: variable 'ansible_timeout' from source: unknown 11661 1726882382.18842: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882382.18989: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11661 1726882382.19004: variable 'omit' from source: magic vars 11661 1726882382.19018: starting attempt loop 11661 1726882382.19026: running the handler 11661 1726882382.19097: handler run complete 11661 1726882382.19136: attempt loop complete, returning result 11661 1726882382.19146: _execute() done 11661 1726882382.19168: dumping result to json 11661 1726882382.19191: done dumping result, returning 11661 1726882382.19221: done running TaskExecutor() for managed_node2/TASK: TEST Add Bond with 2 ports [0e448fcc-3ce9-896b-2321-00000000001e] 11661 1726882382.19257: sending task result for task 0e448fcc-3ce9-896b-2321-00000000001e 11661 1726882382.19387: done sending task result for task 0e448fcc-3ce9-896b-2321-00000000001e 11661 1726882382.19389: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: ################################################## 11661 1726882382.19436: no more pending results, returning what we have 11661 1726882382.19439: results queue empty 11661 1726882382.19440: checking for any_errors_fatal 11661 1726882382.19445: done checking for any_errors_fatal 11661 1726882382.19446: checking for max_fail_percentage 11661 1726882382.19447: done checking for max_fail_percentage 11661 1726882382.19448: checking to see if all hosts have failed and the running result is not ok 11661 1726882382.19449: done checking to see if all hosts have failed 11661 1726882382.19449: getting the remaining hosts for this loop 11661 1726882382.19451: done getting the remaining hosts for this loop 11661 1726882382.19455: getting the next task for host managed_node2 11661 1726882382.19463: done getting next task for host managed_node2 11661 1726882382.19469: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 11661 1726882382.19474: ^ state is: HOST STATE: block=2, 
task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11661 1726882382.19506: getting variables 11661 1726882382.19508: in VariableManager get_vars() 11661 1726882382.19547: Calling all_inventory to load vars for managed_node2 11661 1726882382.19550: Calling groups_inventory to load vars for managed_node2 11661 1726882382.19553: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882382.19583: Calling all_plugins_play to load vars for managed_node2 11661 1726882382.19587: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882382.19591: Calling groups_plugins_play to load vars for managed_node2 11661 1726882382.19723: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882382.19853: done with get_vars() 11661 1726882382.19862: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:33:02 -0400 (0:00:00.030) 0:00:10.912 ****** 11661 1726882382.19931: entering _queue_task() for managed_node2/include_tasks 11661 1726882382.20886: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 11661 1726882382.20900: in run() - task 0e448fcc-3ce9-896b-2321-000000000026 11661 1726882382.20904: variable 'ansible_search_path' from source: unknown 11661 
1726882382.20907: variable 'ansible_search_path' from source: unknown 11661 1726882382.20909: calling self._execute() 11661 1726882382.20912: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882382.20915: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882382.20895: worker is 1 (out of 1 available) 11661 1726882382.20919: exiting _queue_task() for managed_node2/include_tasks 11661 1726882382.20927: done queuing things up, now waiting for results queue to drain 11661 1726882382.20929: waiting for pending results... 11661 1726882382.21067: variable 'omit' from source: magic vars 11661 1726882382.21799: variable 'ansible_distribution_major_version' from source: facts 11661 1726882382.21822: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882382.21835: _execute() done 11661 1726882382.21843: dumping result to json 11661 1726882382.21851: done dumping result, returning 11661 1726882382.21866: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-896b-2321-000000000026] 11661 1726882382.21877: sending task result for task 0e448fcc-3ce9-896b-2321-000000000026 11661 1726882382.22009: no more pending results, returning what we have 11661 1726882382.22014: in VariableManager get_vars() 11661 1726882382.22065: Calling all_inventory to load vars for managed_node2 11661 1726882382.22068: Calling groups_inventory to load vars for managed_node2 11661 1726882382.22071: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882382.22083: Calling all_plugins_play to load vars for managed_node2 11661 1726882382.22086: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882382.22090: Calling groups_plugins_play to load vars for managed_node2 11661 1726882382.22496: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved 
name 11661 1726882382.22933: done with get_vars() 11661 1726882382.22941: variable 'ansible_search_path' from source: unknown 11661 1726882382.22942: variable 'ansible_search_path' from source: unknown 11661 1726882382.22957: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000026 11661 1726882382.22960: WORKER PROCESS EXITING 11661 1726882382.22993: we have included files to process 11661 1726882382.22994: generating all_blocks data 11661 1726882382.22996: done generating all_blocks data 11661 1726882382.22998: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11661 1726882382.22999: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11661 1726882382.23001: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11661 1726882382.24470: done processing included file 11661 1726882382.24472: iterating over new_blocks loaded from include file 11661 1726882382.24474: in VariableManager get_vars() 11661 1726882382.24500: done with get_vars() 11661 1726882382.24502: filtering new block on tags 11661 1726882382.24520: done filtering new block on tags 11661 1726882382.24523: in VariableManager get_vars() 11661 1726882382.24546: done with get_vars() 11661 1726882382.24548: filtering new block on tags 11661 1726882382.24570: done filtering new block on tags 11661 1726882382.24573: in VariableManager get_vars() 11661 1726882382.24597: done with get_vars() 11661 1726882382.24598: filtering new block on tags 11661 1726882382.24616: done filtering new block on tags 11661 1726882382.24619: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 11661 1726882382.24624: extending task lists for all hosts with included blocks 11661 
1726882382.25500: done extending task lists 11661 1726882382.25501: done processing included files 11661 1726882382.25502: results queue empty 11661 1726882382.25503: checking for any_errors_fatal 11661 1726882382.25506: done checking for any_errors_fatal 11661 1726882382.25507: checking for max_fail_percentage 11661 1726882382.25508: done checking for max_fail_percentage 11661 1726882382.25509: checking to see if all hosts have failed and the running result is not ok 11661 1726882382.25510: done checking to see if all hosts have failed 11661 1726882382.25511: getting the remaining hosts for this loop 11661 1726882382.25512: done getting the remaining hosts for this loop 11661 1726882382.25515: getting the next task for host managed_node2 11661 1726882382.25519: done getting next task for host managed_node2 11661 1726882382.25521: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 11661 1726882382.25524: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11661 1726882382.25533: getting variables 11661 1726882382.25535: in VariableManager get_vars() 11661 1726882382.25550: Calling all_inventory to load vars for managed_node2 11661 1726882382.25553: Calling groups_inventory to load vars for managed_node2 11661 1726882382.25555: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882382.25560: Calling all_plugins_play to load vars for managed_node2 11661 1726882382.25563: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882382.25568: Calling groups_plugins_play to load vars for managed_node2 11661 1726882382.25706: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882382.25918: done with get_vars() 11661 1726882382.25928: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:33:02 -0400 (0:00:00.060) 0:00:10.973 ****** 11661 1726882382.26019: entering _queue_task() for managed_node2/setup 11661 1726882382.27142: worker is 1 (out of 1 available) 11661 1726882382.27156: exiting _queue_task() for managed_node2/setup 11661 1726882382.27170: done queuing things up, now waiting for results queue to drain 11661 1726882382.27172: waiting for pending results... 
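The "Ensure ansible_facts used by role are present" task queued here only does work when a required fact is missing; the trace shows its conditional, `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0`, evaluating to False and the task being skipped. A sketch of that skip-on-conditional pattern in Python (illustrative only — this is not Ansible's TaskExecutor code, and the fact names and values below are assumptions):

```python
# Re-creates the check: run only if any required fact name is absent
# from ansible_facts; otherwise emit a skipped result like the trace's
# "when evaluation is False, skipping this task".
def evaluate_required_facts(required_facts, ansible_facts):
    missing = set(required_facts) - set(ansible_facts)
    if missing:
        return {"changed": True, "gathered": sorted(missing)}
    return {
        "changed": False,
        "skip_reason": "Conditional result was False",
        "false_condition": (
            "__network_required_facts | "
            "difference(ansible_facts.keys() | list) | length > 0"
        ),
    }

# Facts were already gathered earlier in this run, so nothing is missing:
result = evaluate_required_facts(
    ["distribution", "os_family"],                      # assumed fact names
    {"distribution": "RedHat", "os_family": "RedHat",
     "distribution_major_version": "9"},                # assumed values
)
print(result["skip_reason"])
```

The set difference is the whole trick: `difference()` in the Jinja2 expression maps directly onto Python set subtraction, and a non-empty result is what would force a targeted fact-gathering pass.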
11661 1726882382.27453: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 11661 1726882382.27603: in run() - task 0e448fcc-3ce9-896b-2321-000000000188 11661 1726882382.27625: variable 'ansible_search_path' from source: unknown 11661 1726882382.27632: variable 'ansible_search_path' from source: unknown 11661 1726882382.27795: calling self._execute() 11661 1726882382.27799: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882382.27802: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882382.27806: variable 'omit' from source: magic vars 11661 1726882382.28188: variable 'ansible_distribution_major_version' from source: facts 11661 1726882382.28205: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882382.28448: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11661 1726882382.30685: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11661 1726882382.30791: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11661 1726882382.30857: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11661 1726882382.30903: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11661 1726882382.30946: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11661 1726882382.31053: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11661 1726882382.31089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11661 1726882382.31119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882382.31178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11661 1726882382.31196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11661 1726882382.31259: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11661 1726882382.31291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11661 1726882382.31321: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882382.31403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11661 1726882382.31448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11661 1726882382.31677: variable '__network_required_facts' from source: role 
'' defaults 11661 1726882382.31686: variable 'ansible_facts' from source: unknown 11661 1726882382.31775: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 11661 1726882382.31779: when evaluation is False, skipping this task 11661 1726882382.31782: _execute() done 11661 1726882382.31784: dumping result to json 11661 1726882382.31786: done dumping result, returning 11661 1726882382.31792: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0e448fcc-3ce9-896b-2321-000000000188] 11661 1726882382.31802: sending task result for task 0e448fcc-3ce9-896b-2321-000000000188 11661 1726882382.31886: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000188 11661 1726882382.31888: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11661 1726882382.31935: no more pending results, returning what we have 11661 1726882382.31939: results queue empty 11661 1726882382.31940: checking for any_errors_fatal 11661 1726882382.31941: done checking for any_errors_fatal 11661 1726882382.31942: checking for max_fail_percentage 11661 1726882382.31944: done checking for max_fail_percentage 11661 1726882382.31944: checking to see if all hosts have failed and the running result is not ok 11661 1726882382.31945: done checking to see if all hosts have failed 11661 1726882382.31946: getting the remaining hosts for this loop 11661 1726882382.31947: done getting the remaining hosts for this loop 11661 1726882382.31952: getting the next task for host managed_node2 11661 1726882382.31962: done getting next task for host managed_node2 11661 1726882382.31972: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 11661 1726882382.31976: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11661 1726882382.31989: getting variables 11661 1726882382.31990: in VariableManager get_vars() 11661 1726882382.32029: Calling all_inventory to load vars for managed_node2 11661 1726882382.32032: Calling groups_inventory to load vars for managed_node2 11661 1726882382.32034: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882382.32043: Calling all_plugins_play to load vars for managed_node2 11661 1726882382.32045: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882382.32048: Calling groups_plugins_play to load vars for managed_node2 11661 1726882382.32204: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882382.32356: done with get_vars() 11661 1726882382.32366: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:33:02 -0400 (0:00:00.064) 0:00:11.038 ****** 11661 1726882382.32476: entering _queue_task() for managed_node2/stat 11661 1726882382.32720: worker is 1 (out of 1 
available) 11661 1726882382.32733: exiting _queue_task() for managed_node2/stat 11661 1726882382.32744: done queuing things up, now waiting for results queue to drain 11661 1726882382.32746: waiting for pending results... 11661 1726882382.33014: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 11661 1726882382.33158: in run() - task 0e448fcc-3ce9-896b-2321-00000000018a 11661 1726882382.33181: variable 'ansible_search_path' from source: unknown 11661 1726882382.33193: variable 'ansible_search_path' from source: unknown 11661 1726882382.33234: calling self._execute() 11661 1726882382.33320: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882382.33332: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882382.33346: variable 'omit' from source: magic vars 11661 1726882382.33724: variable 'ansible_distribution_major_version' from source: facts 11661 1726882382.33747: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882382.33919: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11661 1726882382.34203: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11661 1726882382.34255: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11661 1726882382.34302: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11661 1726882382.34340: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11661 1726882382.34433: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11661 1726882382.34468: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11661 1726882382.34505: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882382.34537: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11661 1726882382.34635: variable '__network_is_ostree' from source: set_fact 11661 1726882382.34646: Evaluated conditional (not __network_is_ostree is defined): False 11661 1726882382.34656: when evaluation is False, skipping this task 11661 1726882382.34663: _execute() done 11661 1726882382.34672: dumping result to json 11661 1726882382.34678: done dumping result, returning 11661 1726882382.34688: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0e448fcc-3ce9-896b-2321-00000000018a] 11661 1726882382.34698: sending task result for task 0e448fcc-3ce9-896b-2321-00000000018a 11661 1726882382.34799: done sending task result for task 0e448fcc-3ce9-896b-2321-00000000018a 11661 1726882382.34806: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 11661 1726882382.34872: no more pending results, returning what we have 11661 1726882382.34877: results queue empty 11661 1726882382.34877: checking for any_errors_fatal 11661 1726882382.34884: done checking for any_errors_fatal 11661 1726882382.34885: checking for max_fail_percentage 11661 1726882382.34886: done checking for max_fail_percentage 11661 1726882382.34887: checking to see if all hosts have failed and the running result is not ok 11661 
1726882382.34888: done checking to see if all hosts have failed 11661 1726882382.34889: getting the remaining hosts for this loop 11661 1726882382.34891: done getting the remaining hosts for this loop 11661 1726882382.34894: getting the next task for host managed_node2 11661 1726882382.34901: done getting next task for host managed_node2 11661 1726882382.34905: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 11661 1726882382.34909: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11661 1726882382.34922: getting variables 11661 1726882382.34924: in VariableManager get_vars() 11661 1726882382.34967: Calling all_inventory to load vars for managed_node2 11661 1726882382.34970: Calling groups_inventory to load vars for managed_node2 11661 1726882382.34973: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882382.34983: Calling all_plugins_play to load vars for managed_node2 11661 1726882382.34985: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882382.34988: Calling groups_plugins_play to load vars for managed_node2 11661 1726882382.35177: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882382.35407: done with get_vars() 11661 1726882382.35419: done getting variables 11661 1726882382.35716: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:33:02 -0400 (0:00:00.032) 0:00:11.070 ****** 11661 1726882382.35755: entering _queue_task() for managed_node2/set_fact 11661 1726882382.35986: worker is 1 (out of 1 available) 11661 1726882382.35999: exiting _queue_task() for managed_node2/set_fact 11661 1726882382.36011: done queuing things up, now waiting for results queue to drain 11661 1726882382.36013: waiting for pending results... 
11661 1726882382.36281: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 11661 1726882382.36423: in run() - task 0e448fcc-3ce9-896b-2321-00000000018b 11661 1726882382.36441: variable 'ansible_search_path' from source: unknown 11661 1726882382.36452: variable 'ansible_search_path' from source: unknown 11661 1726882382.36496: calling self._execute() 11661 1726882382.36585: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882382.36597: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882382.36612: variable 'omit' from source: magic vars 11661 1726882382.36982: variable 'ansible_distribution_major_version' from source: facts 11661 1726882382.37004: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882382.37180: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11661 1726882382.37521: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11661 1726882382.37577: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11661 1726882382.37615: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11661 1726882382.37659: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11661 1726882382.37746: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11661 1726882382.37783: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11661 1726882382.37812: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882382.37841: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11661 1726882382.37937: variable '__network_is_ostree' from source: set_fact 11661 1726882382.37952: Evaluated conditional (not __network_is_ostree is defined): False 11661 1726882382.37961: when evaluation is False, skipping this task 11661 1726882382.37974: _execute() done 11661 1726882382.37980: dumping result to json 11661 1726882382.37989: done dumping result, returning 11661 1726882382.38000: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0e448fcc-3ce9-896b-2321-00000000018b] 11661 1726882382.38009: sending task result for task 0e448fcc-3ce9-896b-2321-00000000018b skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 11661 1726882382.38161: no more pending results, returning what we have 11661 1726882382.38167: results queue empty 11661 1726882382.38168: checking for any_errors_fatal 11661 1726882382.38174: done checking for any_errors_fatal 11661 1726882382.38174: checking for max_fail_percentage 11661 1726882382.38176: done checking for max_fail_percentage 11661 1726882382.38177: checking to see if all hosts have failed and the running result is not ok 11661 1726882382.38178: done checking to see if all hosts have failed 11661 1726882382.38178: getting the remaining hosts for this loop 11661 1726882382.38180: done getting the remaining hosts for this loop 11661 1726882382.38184: getting the next task for host managed_node2 11661 1726882382.38195: done getting next task for host managed_node2 11661 
1726882382.38199: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 11661 1726882382.38203: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11661 1726882382.38216: getting variables 11661 1726882382.38218: in VariableManager get_vars() 11661 1726882382.38261: Calling all_inventory to load vars for managed_node2 11661 1726882382.38265: Calling groups_inventory to load vars for managed_node2 11661 1726882382.38268: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882382.38278: Calling all_plugins_play to load vars for managed_node2 11661 1726882382.38281: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882382.38284: Calling groups_plugins_play to load vars for managed_node2 11661 1726882382.38497: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882382.38703: done with get_vars() 11661 1726882382.38715: done getting variables 11661 1726882382.38775: done sending task result for task 0e448fcc-3ce9-896b-2321-00000000018b 11661 1726882382.38783: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Check which 
services are running] **** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:33:02 -0400 (0:00:00.033) 0:00:11.104 ****** 11661 1726882382.39089: entering _queue_task() for managed_node2/service_facts 11661 1726882382.39091: Creating lock for service_facts 11661 1726882382.39341: worker is 1 (out of 1 available) 11661 1726882382.39356: exiting _queue_task() for managed_node2/service_facts 11661 1726882382.39370: done queuing things up, now waiting for results queue to drain 11661 1726882382.39372: waiting for pending results... 11661 1726882382.39632: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 11661 1726882382.39776: in run() - task 0e448fcc-3ce9-896b-2321-00000000018d 11661 1726882382.39794: variable 'ansible_search_path' from source: unknown 11661 1726882382.39802: variable 'ansible_search_path' from source: unknown 11661 1726882382.39844: calling self._execute() 11661 1726882382.39935: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882382.39947: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882382.39967: variable 'omit' from source: magic vars 11661 1726882382.40340: variable 'ansible_distribution_major_version' from source: facts 11661 1726882382.40367: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882382.40380: variable 'omit' from source: magic vars 11661 1726882382.40457: variable 'omit' from source: magic vars 11661 1726882382.40501: variable 'omit' from source: magic vars 11661 1726882382.40543: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11661 1726882382.40591: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11661 1726882382.40615: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11661 1726882382.40638: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882382.40657: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882382.40697: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11661 1726882382.40706: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882382.40713: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882382.40819: Set connection var ansible_connection to ssh 11661 1726882382.40830: Set connection var ansible_pipelining to False 11661 1726882382.40839: Set connection var ansible_module_compression to ZIP_DEFLATED 11661 1726882382.40853: Set connection var ansible_timeout to 10 11661 1726882382.40860: Set connection var ansible_shell_type to sh 11661 1726882382.40875: Set connection var ansible_shell_executable to /bin/sh 11661 1726882382.40904: variable 'ansible_shell_executable' from source: unknown 11661 1726882382.40912: variable 'ansible_connection' from source: unknown 11661 1726882382.40919: variable 'ansible_module_compression' from source: unknown 11661 1726882382.40924: variable 'ansible_shell_type' from source: unknown 11661 1726882382.40929: variable 'ansible_shell_executable' from source: unknown 11661 1726882382.40935: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882382.40941: variable 'ansible_pipelining' from source: unknown 11661 1726882382.40946: variable 'ansible_timeout' from source: unknown 11661 1726882382.40955: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882382.41155: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11661 1726882382.41174: variable 'omit' from source: magic vars 11661 1726882382.41184: starting attempt loop 11661 1726882382.41192: running the handler 11661 1726882382.41210: _low_level_execute_command(): starting 11661 1726882382.41228: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11661 1726882382.42018: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882382.42034: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882382.42049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882382.42076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882382.42124: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882382.42137: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882382.42153: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882382.42175: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882382.42187: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882382.42198: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882382.42214: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882382.42228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882382.42244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882382.42261: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.11.158 originally 10.31.11.158 <<< 11661 1726882382.42276: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882382.42290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882382.42374: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882382.42397: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882382.42411: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882382.42559: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882382.44210: stdout chunk (state=3): >>>/root <<< 11661 1726882382.44315: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882382.44404: stderr chunk (state=3): >>><<< 11661 1726882382.44423: stdout chunk (state=3): >>><<< 11661 1726882382.44553: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882382.44557: _low_level_execute_command(): starting 11661 1726882382.44560: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882382.444536-12199-207860797340335 `" && echo ansible-tmp-1726882382.444536-12199-207860797340335="` echo /root/.ansible/tmp/ansible-tmp-1726882382.444536-12199-207860797340335 `" ) && sleep 0' 11661 1726882382.45172: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882382.45185: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882382.45203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882382.45225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882382.45275: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882382.45288: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882382.45302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882382.45330: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882382.45348: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882382.45355: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882382.45424: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882382.45458: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882382.45593: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882382.47472: stdout chunk (state=3): >>>ansible-tmp-1726882382.444536-12199-207860797340335=/root/.ansible/tmp/ansible-tmp-1726882382.444536-12199-207860797340335 <<< 11661 1726882382.47626: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882382.47630: stderr chunk (state=3): >>><<< 11661 1726882382.47632: stdout chunk (state=3): >>><<< 11661 1726882382.47644: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882382.444536-12199-207860797340335=/root/.ansible/tmp/ansible-tmp-1726882382.444536-12199-207860797340335 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882382.47688: variable 'ansible_module_compression' from source: unknown 11661 1726882382.47726: ANSIBALLZ: Using lock for service_facts 11661 1726882382.47729: ANSIBALLZ: Acquiring lock 11661 1726882382.47732: ANSIBALLZ: Lock acquired: 139652594171488 11661 1726882382.47734: ANSIBALLZ: Creating module 11661 1726882382.65376: ANSIBALLZ: Writing module into payload 11661 1726882382.65542: ANSIBALLZ: Writing module 11661 1726882382.65604: ANSIBALLZ: Renaming module 11661 1726882382.65624: ANSIBALLZ: Done creating module 11661 1726882382.65657: variable 'ansible_facts' from source: unknown 11661 1726882382.65776: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882382.444536-12199-207860797340335/AnsiballZ_service_facts.py 11661 1726882382.66016: Sending initial data 11661 1726882382.66026: Sent initial data (161 bytes) 11661 1726882382.67463: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882382.67489: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882382.67506: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882382.67525: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882382.67571: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882382.67592: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882382.67609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882382.67621: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882382.67628: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is 
address <<< 11661 1726882382.67636: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882382.67644: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882382.67655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882382.67670: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882382.67693: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882382.67696: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882382.67699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882382.67807: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882382.67831: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882382.67835: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882382.67979: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882382.69802: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11661 1726882382.69908: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 
11661 1726882382.70007: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11661k_nrvl51/tmpiikdwp9d /root/.ansible/tmp/ansible-tmp-1726882382.444536-12199-207860797340335/AnsiballZ_service_facts.py <<< 11661 1726882382.70102: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11661 1726882382.71786: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882382.72129: stderr chunk (state=3): >>><<< 11661 1726882382.72133: stdout chunk (state=3): >>><<< 11661 1726882382.72135: done transferring module to remote 11661 1726882382.72137: _low_level_execute_command(): starting 11661 1726882382.72143: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882382.444536-12199-207860797340335/ /root/.ansible/tmp/ansible-tmp-1726882382.444536-12199-207860797340335/AnsiballZ_service_facts.py && sleep 0' 11661 1726882382.73771: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882382.73776: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882382.73786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882382.73812: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 11661 1726882382.73815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 11661 1726882382.73819: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 
11661 1726882382.73821: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882382.73897: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882382.73900: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882382.73907: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882382.74114: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882382.75909: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882382.75913: stdout chunk (state=3): >>><<< 11661 1726882382.75915: stderr chunk (state=3): >>><<< 11661 1726882382.76013: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882382.76017: _low_level_execute_command(): starting 11661 1726882382.76021: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882382.444536-12199-207860797340335/AnsiballZ_service_facts.py && sleep 0' 11661 1726882382.76884: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882382.76899: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882382.76941: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882382.76965: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882382.77004: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882382.77007: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882382.77009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 11661 1726882382.77014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882382.77084: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882382.77099: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882382.77110: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 11661 1726882382.77233: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882384.15447: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": 
"nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": 
"serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": 
"systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "sta<<< 11661 1726882384.15483: stdout chunk (state=3): >>>tic", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias"<<< 11661 1726882384.15488: stdout chunk (state=3): >>>, "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": 
"quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": 
"user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 11661 1726882384.16735: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 11661 1726882384.16815: stderr chunk (state=3): >>><<< 11661 1726882384.16818: stdout chunk (state=3): >>><<< 11661 1726882384.16841: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": 
"rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", 
"source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", 
"source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": 
{"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": 
"systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
11661 1726882384.17461: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882382.444536-12199-207860797340335/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11661 1726882384.17471: _low_level_execute_command(): starting 11661 1726882384.17477: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882382.444536-12199-207860797340335/ > /dev/null 2>&1 && sleep 0' 11661 1726882384.18137: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882384.18147: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882384.18161: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882384.18178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882384.18217: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882384.18224: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882384.18235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882384.18248: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882384.18267: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is 
address <<< 11661 1726882384.18280: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882384.18293: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882384.18308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882384.18324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882384.18336: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882384.18352: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882384.18368: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882384.18444: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882384.18471: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882384.18488: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882384.18619: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882384.20589: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882384.20618: stderr chunk (state=3): >>><<< 11661 1726882384.20622: stdout chunk (state=3): >>><<< 11661 1726882384.20774: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882384.20777: handler run complete 11661 1726882384.20842: variable 'ansible_facts' from source: unknown 11661 1726882384.21002: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882384.21495: variable 'ansible_facts' from source: unknown 11661 1726882384.21569: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882384.21691: attempt loop complete, returning result 11661 1726882384.21695: _execute() done 11661 1726882384.21697: dumping result to json 11661 1726882384.21729: done dumping result, returning 11661 1726882384.21738: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0e448fcc-3ce9-896b-2321-00000000018d] 11661 1726882384.21743: sending task result for task 0e448fcc-3ce9-896b-2321-00000000018d 11661 1726882384.22214: done sending task result for task 0e448fcc-3ce9-896b-2321-00000000018d 11661 1726882384.22217: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11661 1726882384.22263: no more pending results, returning what we have 11661 1726882384.22267: results queue empty 11661 1726882384.22268: checking for 
any_errors_fatal 11661 1726882384.22271: done checking for any_errors_fatal 11661 1726882384.22272: checking for max_fail_percentage 11661 1726882384.22273: done checking for max_fail_percentage 11661 1726882384.22274: checking to see if all hosts have failed and the running result is not ok 11661 1726882384.22274: done checking to see if all hosts have failed 11661 1726882384.22275: getting the remaining hosts for this loop 11661 1726882384.22276: done getting the remaining hosts for this loop 11661 1726882384.22279: getting the next task for host managed_node2 11661 1726882384.22284: done getting next task for host managed_node2 11661 1726882384.22287: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 11661 1726882384.22290: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11661 1726882384.22299: getting variables 11661 1726882384.22300: in VariableManager get_vars() 11661 1726882384.22327: Calling all_inventory to load vars for managed_node2 11661 1726882384.22329: Calling groups_inventory to load vars for managed_node2 11661 1726882384.22330: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882384.22337: Calling all_plugins_play to load vars for managed_node2 11661 1726882384.22338: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882384.22344: Calling groups_plugins_play to load vars for managed_node2 11661 1726882384.22558: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882384.22826: done with get_vars() 11661 1726882384.22836: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:33:04 -0400 (0:00:01.838) 0:00:12.942 ****** 11661 1726882384.22906: entering _queue_task() for managed_node2/package_facts 11661 1726882384.22907: Creating lock for package_facts 11661 1726882384.23102: worker is 1 (out of 1 available) 11661 1726882384.23116: exiting _queue_task() for managed_node2/package_facts 11661 1726882384.23131: done queuing things up, now waiting for results queue to drain 11661 1726882384.23133: waiting for pending results... 
11661 1726882384.23324: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 11661 1726882384.23436: in run() - task 0e448fcc-3ce9-896b-2321-00000000018e 11661 1726882384.23447: variable 'ansible_search_path' from source: unknown 11661 1726882384.23453: variable 'ansible_search_path' from source: unknown 11661 1726882384.23490: calling self._execute() 11661 1726882384.23559: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882384.23562: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882384.23575: variable 'omit' from source: magic vars 11661 1726882384.23924: variable 'ansible_distribution_major_version' from source: facts 11661 1726882384.23940: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882384.23949: variable 'omit' from source: magic vars 11661 1726882384.24024: variable 'omit' from source: magic vars 11661 1726882384.24059: variable 'omit' from source: magic vars 11661 1726882384.24105: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11661 1726882384.24145: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11661 1726882384.24173: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11661 1726882384.24194: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882384.24212: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882384.24242: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11661 1726882384.24249: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882384.24256: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 11661 1726882384.24352: Set connection var ansible_connection to ssh 11661 1726882384.24369: Set connection var ansible_pipelining to False 11661 1726882384.24379: Set connection var ansible_module_compression to ZIP_DEFLATED 11661 1726882384.24390: Set connection var ansible_timeout to 10 11661 1726882384.24399: Set connection var ansible_shell_type to sh 11661 1726882384.24412: Set connection var ansible_shell_executable to /bin/sh 11661 1726882384.24435: variable 'ansible_shell_executable' from source: unknown 11661 1726882384.24442: variable 'ansible_connection' from source: unknown 11661 1726882384.24448: variable 'ansible_module_compression' from source: unknown 11661 1726882384.24454: variable 'ansible_shell_type' from source: unknown 11661 1726882384.24460: variable 'ansible_shell_executable' from source: unknown 11661 1726882384.24467: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882384.24479: variable 'ansible_pipelining' from source: unknown 11661 1726882384.24490: variable 'ansible_timeout' from source: unknown 11661 1726882384.24493: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882384.24658: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11661 1726882384.24667: variable 'omit' from source: magic vars 11661 1726882384.24672: starting attempt loop 11661 1726882384.24694: running the handler 11661 1726882384.24699: _low_level_execute_command(): starting 11661 1726882384.24710: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11661 1726882384.25437: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882384.25451: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 11661 1726882384.25470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882384.25489: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882384.25529: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882384.25542: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882384.25555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882384.25576: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882384.25588: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882384.25598: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882384.25610: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882384.25623: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882384.25638: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882384.25651: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882384.25667: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882384.25682: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882384.25779: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882384.25802: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882384.25820: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882384.25959: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
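The records above show Ansible's first `_low_level_execute_command()` round trip: over the multiplexed SSH connection it runs `/bin/sh -c 'echo ~ && sleep 0'` to discover the remote user's home directory (the `/root` seen in the next stdout chunk). A minimal local sketch of that probe — the `probe_home` helper is hypothetical, not Ansible code, and it runs the shell locally rather than wrapping it in `ssh` as Ansible does:

```python
import subprocess

def probe_home(shell="/bin/sh"):
    # Run the same command Ansible issues remotely. 'echo ~' prints the
    # shell's tilde expansion (normally $HOME); the trailing 'sleep 0'
    # guarantees a clean zero exit status for the && chain.
    proc = subprocess.run(
        [shell, "-c", "echo ~ && sleep 0"],
        capture_output=True, text=True, check=True,
    )
    # Ansible parses this stdout to decide where ~/.ansible/tmp lives.
    return proc.stdout.strip()

home = probe_home()
```

In the real code path this string seeds the remote temp-directory location used a few records later (`/root/.ansible/tmp/ansible-tmp-…`).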
11661 1726882384.27592: stdout chunk (state=3): >>>/root <<< 11661 1726882384.27710: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882384.27813: stderr chunk (state=3): >>><<< 11661 1726882384.27816: stdout chunk (state=3): >>><<< 11661 1726882384.27893: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882384.27897: _low_level_execute_command(): starting 11661 1726882384.27900: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882384.2783735-12291-150954302611857 `" && echo ansible-tmp-1726882384.2783735-12291-150954302611857="` echo /root/.ansible/tmp/ansible-tmp-1726882384.2783735-12291-150954302611857 `" ) && sleep 0' 11661 1726882384.28613: stderr chunk (state=2): 
>>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882384.28628: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882384.28643: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882384.28672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882384.28711: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882384.28715: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882384.28724: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882384.28741: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882384.28744: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882384.28747: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882384.28768: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882384.28771: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882384.28781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882384.28789: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882384.28808: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882384.28811: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882384.28878: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882384.28895: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882384.28912: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 11661 1726882384.29038: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882384.30937: stdout chunk (state=3): >>>ansible-tmp-1726882384.2783735-12291-150954302611857=/root/.ansible/tmp/ansible-tmp-1726882384.2783735-12291-150954302611857 <<< 11661 1726882384.31042: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882384.31137: stderr chunk (state=3): >>><<< 11661 1726882384.31148: stdout chunk (state=3): >>><<< 11661 1726882384.31281: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882384.2783735-12291-150954302611857=/root/.ansible/tmp/ansible-tmp-1726882384.2783735-12291-150954302611857 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882384.31284: variable 'ansible_module_compression' from source: unknown 11661 1726882384.31389: 
ANSIBALLZ: Using lock for package_facts 11661 1726882384.31393: ANSIBALLZ: Acquiring lock 11661 1726882384.31395: ANSIBALLZ: Lock acquired: 139652577858944 11661 1726882384.31397: ANSIBALLZ: Creating module 11661 1726882384.71553: ANSIBALLZ: Writing module into payload 11661 1726882384.71854: ANSIBALLZ: Writing module 11661 1726882384.71960: ANSIBALLZ: Renaming module 11661 1726882384.72040: ANSIBALLZ: Done creating module 11661 1726882384.72088: variable 'ansible_facts' from source: unknown 11661 1726882384.72512: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882384.2783735-12291-150954302611857/AnsiballZ_package_facts.py 11661 1726882384.73282: Sending initial data 11661 1726882384.73286: Sent initial data (162 bytes) 11661 1726882384.76118: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882384.76122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882384.76131: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882384.77081: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882384.77092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882384.77106: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882384.77115: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882384.77120: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882384.77129: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882384.77138: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882384.77152: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882384.77158: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882384.77167: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882384.77177: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882384.77246: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882384.77264: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882384.77271: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882384.77406: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882384.79269: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 11661 1726882384.79276: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11661 1726882384.79372: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 11661 1726882384.79474: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11661k_nrvl51/tmpgn6s7nct /root/.ansible/tmp/ansible-tmp-1726882384.2783735-12291-150954302611857/AnsiballZ_package_facts.py <<< 11661 1726882384.79576: stderr chunk (state=3): >>>debug1: Couldn't stat 
remote file: No such file or directory <<< 11661 1726882384.83023: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882384.83027: stderr chunk (state=3): >>><<< 11661 1726882384.83030: stdout chunk (state=3): >>><<< 11661 1726882384.83056: done transferring module to remote 11661 1726882384.83066: _low_level_execute_command(): starting 11661 1726882384.83073: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882384.2783735-12291-150954302611857/ /root/.ansible/tmp/ansible-tmp-1726882384.2783735-12291-150954302611857/AnsiballZ_package_facts.py && sleep 0' 11661 1726882384.84787: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882384.84803: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882384.84824: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882384.84843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882384.84892: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882384.84905: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882384.84919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882384.84941: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882384.84956: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882384.84969: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882384.84981: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882384.84992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882384.85006: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882384.85016: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882384.85027: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882384.85044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882384.85124: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882384.85146: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882384.85173: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882384.85309: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882384.87185: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882384.87273: stderr chunk (state=3): >>><<< 11661 1726882384.87276: stdout chunk (state=3): >>><<< 11661 1726882384.87372: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882384.87375: _low_level_execute_command(): starting 11661 1726882384.87377: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882384.2783735-12291-150954302611857/AnsiballZ_package_facts.py && sleep 0' 11661 1726882384.87990: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882384.88004: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882384.88026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882384.88046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882384.88093: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882384.88106: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882384.88122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882384.88147: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882384.88166: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882384.88193: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882384.88215: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882384.88230: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882384.88255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 
1726882384.88309: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882384.88321: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882384.88334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882384.88430: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882384.88471: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882384.88494: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882384.88632: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882385.35246: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": 
[{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", 
"release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": nu<<< 11661 1726882385.35359: stdout chunk (state=3): >>>ll, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": 
[{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": 
"3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": 
[{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": 
"libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", 
"release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": 
"cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": 
"libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", 
"release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": 
[{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": 
"2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": 
"2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": 
[{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source<<< 11661 1726882385.35471: stdout chunk (state=3): >>>": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": 
"6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": <<< 11661 1726882385.35495: stdout chunk (state=3): >>>"7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": 
"4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", 
"version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3<<< 11661 1726882385.35534: stdout chunk (state=3): >>>.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", 
"epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch<<< 11661 1726882385.35539: stdout chunk (state=3): >>>": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": 
"1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "r<<< 11661 1726882385.35566: stdout chunk (state=3): >>>pm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch":<<< 11661 1726882385.35585: stdout chunk (state=3): >>> "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", 
"version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"<<< 11661 1726882385.35592: stdout chunk (state=3): >>>}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": 
"4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 11661 1726882385.37668: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 11661 1726882385.37690: stderr chunk (state=3): >>><<< 11661 1726882385.37693: stdout chunk (state=3): >>><<< 11661 1726882385.37979: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": 
[{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": 
[{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", 
"release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": 
"0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": 
"libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": 
"cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", 
"version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", 
"release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", 
"version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": 
[{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", 
"release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": 
"elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": 
"2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", 
"release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": 
"liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": 
"146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": 
[{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": 
"perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", 
"version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, 
"arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": 
"python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": 
[{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
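The JSON blob that ends above is the `ansible_facts.packages` structure returned by the `package_facts` module: a dict keyed by package name, each value a list of entries with `name`, `version`, `release`, `epoch`, `arch`, and `source` fields. As a minimal sketch of consuming that structure, the helper below (the name `nevra` is made up for illustration) formats one entry as an RPM NEVRA-style string, using two entries copied verbatim from the output above:

```python
def nevra(pkg):
    """Format one package_facts entry as an RPM NEVRA-style string.

    `pkg` is a dict shaped like the entries in the log above:
    {"name": ..., "version": ..., "release": ..., "epoch": ..., "arch": ...}
    """
    # A null or zero epoch is conventionally omitted from the string.
    epoch = "" if pkg["epoch"] in (None, 0) else f"{pkg['epoch']}:"
    # gpg-pubkey pseudo-packages have arch null; skip the suffix then.
    arch = f".{pkg['arch']}" if pkg["arch"] else ""
    return f"{pkg['name']}-{epoch}{pkg['version']}-{pkg['release']}{arch}"

# Entries copied from the module output above.
rsyslog = {"name": "rsyslog", "version": "8.2310.0", "release": "4.el9",
           "epoch": None, "arch": "x86_64", "source": "rpm"}
qga = {"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9",
       "epoch": 17, "arch": "x86_64", "source": "rpm"}

print(nevra(rsyslog))  # rsyslog-8.2310.0-4.el9.x86_64
print(nevra(qga))      # qemu-guest-agent-17:9.0.0-10.el9.x86_64
```

In a playbook, the same dict is reachable as `ansible_facts.packages` (or the top-level `packages` fact) once `package_facts` has run.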
11661 1726882385.40287: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882384.2783735-12291-150954302611857/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11661 1726882385.40304: _low_level_execute_command(): starting 11661 1726882385.40308: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882384.2783735-12291-150954302611857/ > /dev/null 2>&1 && sleep 0' 11661 1726882385.40749: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882385.40757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882385.40789: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 11661 1726882385.40802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882385.40814: stderr chunk (state=3): 
>>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882385.40863: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882385.40879: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882385.40886: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882385.40995: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882385.42868: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882385.42908: stderr chunk (state=3): >>><<< 11661 1726882385.42911: stdout chunk (state=3): >>><<< 11661 1726882385.42926: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 11661 1726882385.42932: handler run complete 11661 1726882385.43820: variable 'ansible_facts' from source: unknown 11661 1726882385.44095: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882385.45279: variable 'ansible_facts' from source: unknown 11661 1726882385.45815: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882385.46623: attempt loop complete, returning result 11661 1726882385.46634: _execute() done 11661 1726882385.46636: dumping result to json 11661 1726882385.46782: done dumping result, returning 11661 1726882385.46787: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0e448fcc-3ce9-896b-2321-00000000018e] 11661 1726882385.46792: sending task result for task 0e448fcc-3ce9-896b-2321-00000000018e ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11661 1726882385.48275: done sending task result for task 0e448fcc-3ce9-896b-2321-00000000018e 11661 1726882385.48280: WORKER PROCESS EXITING 11661 1726882385.48286: no more pending results, returning what we have 11661 1726882385.48288: results queue empty 11661 1726882385.48288: checking for any_errors_fatal 11661 1726882385.48291: done checking for any_errors_fatal 11661 1726882385.48291: checking for max_fail_percentage 11661 1726882385.48292: done checking for max_fail_percentage 11661 1726882385.48293: checking to see if all hosts have failed and the running result is not ok 11661 1726882385.48293: done checking to see if all hosts have failed 11661 1726882385.48294: getting the remaining hosts for this loop 11661 1726882385.48294: done getting the remaining hosts for this loop 11661 1726882385.48297: getting the next task for host managed_node2 11661 1726882385.48302: done 
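The censored `ok` result above is produced by `no_log: true` on the package-gathering task. A minimal sketch of what that task likely looks like — the task name is taken verbatim from the log, and `no_log: true` is inferred from the "output has been hidden" message; the real task lives inside the `fedora.linux_system_roles.network` collection:

```yaml
# Hypothetical reconstruction of the task that produced the censored
# result above (not the collection's actual source).
- name: Check which packages are installed
  ansible.builtin.package_facts:
  no_log: true  # inferred: the log says "'no_log: true' was specified for this result"
```

With `no_log: true`, Ansible still registers the facts on the host; only the displayed/logged result payload is replaced by the "censored" placeholder.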
getting next task for host managed_node2 11661 1726882385.48304: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 11661 1726882385.48306: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11661 1726882385.48311: getting variables 11661 1726882385.48312: in VariableManager get_vars() 11661 1726882385.48338: Calling all_inventory to load vars for managed_node2 11661 1726882385.48340: Calling groups_inventory to load vars for managed_node2 11661 1726882385.48341: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882385.48347: Calling all_plugins_play to load vars for managed_node2 11661 1726882385.48349: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882385.48353: Calling groups_plugins_play to load vars for managed_node2 11661 1726882385.49056: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882385.49985: done with get_vars() 11661 1726882385.50000: done getting variables 11661 1726882385.50042: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:33:05 -0400 (0:00:01.271) 0:00:14.214 ****** 11661 1726882385.50073: entering _queue_task() for managed_node2/debug 11661 1726882385.50274: worker is 1 (out of 1 available) 11661 1726882385.50288: exiting _queue_task() for managed_node2/debug 11661 1726882385.50300: done queuing things up, now waiting for results queue to drain 11661 1726882385.50302: waiting for pending results... 11661 1726882385.50474: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 11661 1726882385.50552: in run() - task 0e448fcc-3ce9-896b-2321-000000000027 11661 1726882385.50567: variable 'ansible_search_path' from source: unknown 11661 1726882385.50571: variable 'ansible_search_path' from source: unknown 11661 1726882385.50599: calling self._execute() 11661 1726882385.50667: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882385.50673: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882385.50681: variable 'omit' from source: magic vars 11661 1726882385.50959: variable 'ansible_distribution_major_version' from source: facts 11661 1726882385.50972: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882385.50978: variable 'omit' from source: magic vars 11661 1726882385.51013: variable 'omit' from source: magic vars 11661 1726882385.51086: variable 'network_provider' from source: set_fact 11661 1726882385.51100: variable 'omit' from source: magic vars 11661 1726882385.51131: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11661 1726882385.51162: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11661 1726882385.51180: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11661 
1726882385.51193: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882385.51202: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882385.51225: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11661 1726882385.51228: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882385.51231: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882385.51302: Set connection var ansible_connection to ssh 11661 1726882385.51306: Set connection var ansible_pipelining to False 11661 1726882385.51311: Set connection var ansible_module_compression to ZIP_DEFLATED 11661 1726882385.51318: Set connection var ansible_timeout to 10 11661 1726882385.51321: Set connection var ansible_shell_type to sh 11661 1726882385.51327: Set connection var ansible_shell_executable to /bin/sh 11661 1726882385.51342: variable 'ansible_shell_executable' from source: unknown 11661 1726882385.51345: variable 'ansible_connection' from source: unknown 11661 1726882385.51348: variable 'ansible_module_compression' from source: unknown 11661 1726882385.51352: variable 'ansible_shell_type' from source: unknown 11661 1726882385.51354: variable 'ansible_shell_executable' from source: unknown 11661 1726882385.51357: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882385.51361: variable 'ansible_pipelining' from source: unknown 11661 1726882385.51371: variable 'ansible_timeout' from source: unknown 11661 1726882385.51373: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882385.51470: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11661 1726882385.51484: variable 'omit' from source: magic vars 11661 1726882385.51487: starting attempt loop 11661 1726882385.51490: running the handler 11661 1726882385.51526: handler run complete 11661 1726882385.51536: attempt loop complete, returning result 11661 1726882385.51539: _execute() done 11661 1726882385.51542: dumping result to json 11661 1726882385.51544: done dumping result, returning 11661 1726882385.51550: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-896b-2321-000000000027] 11661 1726882385.51557: sending task result for task 0e448fcc-3ce9-896b-2321-000000000027 11661 1726882385.51637: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000027 11661 1726882385.51640: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 11661 1726882385.51698: no more pending results, returning what we have 11661 1726882385.51701: results queue empty 11661 1726882385.51702: checking for any_errors_fatal 11661 1726882385.51712: done checking for any_errors_fatal 11661 1726882385.51713: checking for max_fail_percentage 11661 1726882385.51714: done checking for max_fail_percentage 11661 1726882385.51715: checking to see if all hosts have failed and the running result is not ok 11661 1726882385.51716: done checking to see if all hosts have failed 11661 1726882385.51716: getting the remaining hosts for this loop 11661 1726882385.51718: done getting the remaining hosts for this loop 11661 1726882385.51721: getting the next task for host managed_node2 11661 1726882385.51727: done getting next task for host managed_node2 11661 1726882385.51731: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` 
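The `ok: [managed_node2] => {} MSG: Using network provider: nm` result above comes from a `debug` action (the log loads `plugins/action/debug.py` and reads `network_provider` from `set_fact`). A hedged sketch of that task, reconstructed from the log — the task name and variable name appear in the log, the exact `msg` template is an assumption:

```yaml
# Hypothetical sketch of the "Print network provider" task
# (task path roles/network/tasks/main.yml:7 per the log).
- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"
```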
variable with the initscripts provider 11661 1726882385.51733: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11661 1726882385.51742: getting variables 11661 1726882385.51744: in VariableManager get_vars() 11661 1726882385.51784: Calling all_inventory to load vars for managed_node2 11661 1726882385.51787: Calling groups_inventory to load vars for managed_node2 11661 1726882385.51789: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882385.51797: Calling all_plugins_play to load vars for managed_node2 11661 1726882385.51799: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882385.51802: Calling groups_plugins_play to load vars for managed_node2 11661 1726882385.52680: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882385.53626: done with get_vars() 11661 1726882385.53645: done getting variables 11661 1726882385.53716: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:33:05 -0400 (0:00:00.036) 0:00:14.250 ****** 11661 1726882385.53741: entering _queue_task() for managed_node2/fail 11661 1726882385.53742: Creating lock for fail 11661 1726882385.53971: worker is 1 (out of 1 available) 11661 1726882385.53986: exiting _queue_task() for managed_node2/fail 11661 1726882385.53998: done queuing things up, now waiting for results queue to drain 11661 1726882385.54000: waiting for pending results... 11661 1726882385.54161: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 11661 1726882385.54244: in run() - task 0e448fcc-3ce9-896b-2321-000000000028 11661 1726882385.54254: variable 'ansible_search_path' from source: unknown 11661 1726882385.54265: variable 'ansible_search_path' from source: unknown 11661 1726882385.54296: calling self._execute() 11661 1726882385.54356: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882385.54360: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882385.54370: variable 'omit' from source: magic vars 11661 1726882385.54630: variable 'ansible_distribution_major_version' from source: facts 11661 1726882385.54637: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882385.54720: variable 'network_state' from source: role '' defaults 11661 1726882385.54727: Evaluated conditional (network_state != {}): False 11661 1726882385.54732: when evaluation is False, skipping this task 11661 1726882385.54740: _execute() done 11661 1726882385.54744: dumping result to json 11661 1726882385.54746: done dumping result, returning 11661 1726882385.54753: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if using the `network_state` variable with the initscripts provider [0e448fcc-3ce9-896b-2321-000000000028] 11661 1726882385.54760: sending task result for task 0e448fcc-3ce9-896b-2321-000000000028 11661 1726882385.54844: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000028 11661 1726882385.54847: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11661 1726882385.54908: no more pending results, returning what we have 11661 1726882385.54912: results queue empty 11661 1726882385.54913: checking for any_errors_fatal 11661 1726882385.54919: done checking for any_errors_fatal 11661 1726882385.54920: checking for max_fail_percentage 11661 1726882385.54922: done checking for max_fail_percentage 11661 1726882385.54923: checking to see if all hosts have failed and the running result is not ok 11661 1726882385.54923: done checking to see if all hosts have failed 11661 1726882385.54924: getting the remaining hosts for this loop 11661 1726882385.54926: done getting the remaining hosts for this loop 11661 1726882385.54929: getting the next task for host managed_node2 11661 1726882385.54935: done getting next task for host managed_node2 11661 1726882385.54939: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 11661 1726882385.54942: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 11661 1726882385.54954: getting variables 11661 1726882385.54956: in VariableManager get_vars() 11661 1726882385.54990: Calling all_inventory to load vars for managed_node2 11661 1726882385.54993: Calling groups_inventory to load vars for managed_node2 11661 1726882385.54995: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882385.55002: Calling all_plugins_play to load vars for managed_node2 11661 1726882385.55005: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882385.55007: Calling groups_plugins_play to load vars for managed_node2 11661 1726882385.55778: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882385.56791: done with get_vars() 11661 1726882385.56805: done getting variables 11661 1726882385.56845: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:33:05 -0400 (0:00:00.031) 0:00:14.282 ****** 11661 1726882385.56869: entering _queue_task() for managed_node2/fail 11661 1726882385.57060: worker is 1 (out of 1 available) 11661 1726882385.57074: exiting _queue_task() for managed_node2/fail 11661 1726882385.57088: done queuing things up, now waiting for results queue to drain 11661 1726882385.57089: waiting for pending results... 
11661 1726882385.57247: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 11661 1726882385.57325: in run() - task 0e448fcc-3ce9-896b-2321-000000000029 11661 1726882385.57341: variable 'ansible_search_path' from source: unknown 11661 1726882385.57344: variable 'ansible_search_path' from source: unknown 11661 1726882385.57374: calling self._execute() 11661 1726882385.57440: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882385.57444: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882385.57455: variable 'omit' from source: magic vars 11661 1726882385.57717: variable 'ansible_distribution_major_version' from source: facts 11661 1726882385.57726: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882385.57809: variable 'network_state' from source: role '' defaults 11661 1726882385.57817: Evaluated conditional (network_state != {}): False 11661 1726882385.57820: when evaluation is False, skipping this task 11661 1726882385.57822: _execute() done 11661 1726882385.57825: dumping result to json 11661 1726882385.57827: done dumping result, returning 11661 1726882385.57834: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0e448fcc-3ce9-896b-2321-000000000029] 11661 1726882385.57839: sending task result for task 0e448fcc-3ce9-896b-2321-000000000029 11661 1726882385.57922: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000029 11661 1726882385.57925: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11661 1726882385.58003: no more pending results, returning what we have 11661 
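Both `skipping` results above follow the same pattern: a `fail` guard task whose `when` evaluates to `False`, so the abort never fires. A hedged sketch under those assumptions — the task name and the conditional `network_state != {}` are quoted from the log's `false_condition`, while the `msg` text is invented for illustration:

```yaml
# Hypothetical sketch of one of the skipped guard tasks; the real
# tasks are in roles/network/tasks/main.yml:11 and :18 per the log.
- name: >-
    Abort applying the network state configuration if using the
    `network_state` variable with the initscripts provider
  ansible.builtin.fail:
    msg: Applying network_state is not supported here  # assumed message
  when: network_state != {}  # false_condition reported in the log
```

Because `network_state` resolves to the role default `{}`, the condition is `False` and Ansible records `skip_reason: Conditional result was False` instead of running the module.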
1726882385.58006: results queue empty 11661 1726882385.58007: checking for any_errors_fatal 11661 1726882385.58013: done checking for any_errors_fatal 11661 1726882385.58014: checking for max_fail_percentage 11661 1726882385.58015: done checking for max_fail_percentage 11661 1726882385.58016: checking to see if all hosts have failed and the running result is not ok 11661 1726882385.58016: done checking to see if all hosts have failed 11661 1726882385.58017: getting the remaining hosts for this loop 11661 1726882385.58018: done getting the remaining hosts for this loop 11661 1726882385.58021: getting the next task for host managed_node2 11661 1726882385.58026: done getting next task for host managed_node2 11661 1726882385.58030: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 11661 1726882385.58032: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11661 1726882385.58052: getting variables 11661 1726882385.58053: in VariableManager get_vars() 11661 1726882385.58080: Calling all_inventory to load vars for managed_node2 11661 1726882385.58082: Calling groups_inventory to load vars for managed_node2 11661 1726882385.58083: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882385.58089: Calling all_plugins_play to load vars for managed_node2 11661 1726882385.58091: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882385.58092: Calling groups_plugins_play to load vars for managed_node2 11661 1726882385.58852: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882385.59799: done with get_vars() 11661 1726882385.59819: done getting variables 11661 1726882385.59867: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:33:05 -0400 (0:00:00.030) 0:00:14.312 ****** 11661 1726882385.59893: entering _queue_task() for managed_node2/fail 11661 1726882385.60110: worker is 1 (out of 1 available) 11661 1726882385.60124: exiting _queue_task() for managed_node2/fail 11661 1726882385.60138: done queuing things up, now waiting for results queue to drain 11661 1726882385.60140: waiting for pending results... 
11661 1726882385.60324: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 11661 1726882385.60408: in run() - task 0e448fcc-3ce9-896b-2321-00000000002a 11661 1726882385.60419: variable 'ansible_search_path' from source: unknown 11661 1726882385.60423: variable 'ansible_search_path' from source: unknown 11661 1726882385.60455: calling self._execute() 11661 1726882385.60514: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882385.60518: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882385.60526: variable 'omit' from source: magic vars 11661 1726882385.60785: variable 'ansible_distribution_major_version' from source: facts 11661 1726882385.60795: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882385.60921: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11661 1726882385.63065: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11661 1726882385.63110: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11661 1726882385.63137: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11661 1726882385.63163: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11661 1726882385.63187: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11661 1726882385.63243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11661 1726882385.63265: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11661 1726882385.63282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882385.63313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11661 1726882385.63323: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11661 1726882385.63391: variable 'ansible_distribution_major_version' from source: facts 11661 1726882385.63406: Evaluated conditional (ansible_distribution_major_version | int > 9): False 11661 1726882385.63409: when evaluation is False, skipping this task 11661 1726882385.63412: _execute() done 11661 1726882385.63414: dumping result to json 11661 1726882385.63416: done dumping result, returning 11661 1726882385.63423: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0e448fcc-3ce9-896b-2321-00000000002a] 11661 1726882385.63430: sending task result for task 0e448fcc-3ce9-896b-2321-00000000002a 11661 1726882385.63514: done sending task result for task 0e448fcc-3ce9-896b-2321-00000000002a 11661 1726882385.63516: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 11661 1726882385.63571: no more pending results, returning what we have 11661 1726882385.63575: 
results queue empty 11661 1726882385.63576: checking for any_errors_fatal 11661 1726882385.63581: done checking for any_errors_fatal 11661 1726882385.63582: checking for max_fail_percentage 11661 1726882385.63584: done checking for max_fail_percentage 11661 1726882385.63585: checking to see if all hosts have failed and the running result is not ok 11661 1726882385.63585: done checking to see if all hosts have failed 11661 1726882385.63586: getting the remaining hosts for this loop 11661 1726882385.63588: done getting the remaining hosts for this loop 11661 1726882385.63592: getting the next task for host managed_node2 11661 1726882385.63598: done getting next task for host managed_node2 11661 1726882385.63601: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 11661 1726882385.63604: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
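The EL10 teaming guard skipped above uses a different conditional: `ansible_distribution_major_version | int > 9`, quoted verbatim from the log's `false_condition`. A hedged sketch of that task — name and condition are from the log, the `msg` is an assumption:

```yaml
# Hypothetical sketch of the teaming guard
# (roles/network/tasks/main.yml:25 per the log).
- name: >-
    Abort applying teaming configuration if the system version of
    the managed host is EL10 or later
  ansible.builtin.fail:
    msg: Teaming is not supported on EL10 or later  # assumed message
  when: ansible_distribution_major_version | int > 9
```

On this managed host the major version is below 10, so the conditional evaluates to `False` and the task is skipped rather than failing the play.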
False 11661 1726882385.63617: getting variables 11661 1726882385.63619: in VariableManager get_vars() 11661 1726882385.63658: Calling all_inventory to load vars for managed_node2 11661 1726882385.63660: Calling groups_inventory to load vars for managed_node2 11661 1726882385.63662: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882385.63673: Calling all_plugins_play to load vars for managed_node2 11661 1726882385.63675: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882385.63678: Calling groups_plugins_play to load vars for managed_node2 11661 1726882385.64669: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882385.66269: done with get_vars() 11661 1726882385.66297: done getting variables 11661 1726882385.66394: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:33:05 -0400 (0:00:00.065) 0:00:14.377 ****** 11661 1726882385.66426: entering _queue_task() for managed_node2/dnf 11661 1726882385.66714: worker is 1 (out of 1 available) 11661 1726882385.66727: exiting _queue_task() for managed_node2/dnf 11661 1726882385.66739: done queuing things up, now waiting for results queue to drain 11661 1726882385.66740: waiting for pending results... 
11661 1726882385.67009: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 11661 1726882385.67139: in run() - task 0e448fcc-3ce9-896b-2321-00000000002b 11661 1726882385.67161: variable 'ansible_search_path' from source: unknown 11661 1726882385.67170: variable 'ansible_search_path' from source: unknown 11661 1726882385.67210: calling self._execute() 11661 1726882385.67296: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882385.67307: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882385.67319: variable 'omit' from source: magic vars 11661 1726882385.67677: variable 'ansible_distribution_major_version' from source: facts 11661 1726882385.67694: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882385.67894: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11661 1726882385.70152: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11661 1726882385.70233: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11661 1726882385.70279: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11661 1726882385.70325: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11661 1726882385.70357: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11661 1726882385.70446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11661 1726882385.70482: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11661 1726882385.70513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882385.70565: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11661 1726882385.70587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11661 1726882385.70713: variable 'ansible_distribution' from source: facts 11661 1726882385.70723: variable 'ansible_distribution_major_version' from source: facts 11661 1726882385.70741: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 11661 1726882385.70868: variable '__network_wireless_connections_defined' from source: role '' defaults 11661 1726882385.71001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11661 1726882385.71029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11661 1726882385.71058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882385.71108: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11661 1726882385.71128: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11661 1726882385.71175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11661 1726882385.71207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11661 1726882385.71237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882385.71282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11661 1726882385.71306: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11661 1726882385.71350: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11661 1726882385.71380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11661 
1726882385.71414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882385.71458: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11661 1726882385.71480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11661 1726882385.71641: variable 'network_connections' from source: task vars 11661 1726882385.71660: variable 'controller_profile' from source: play vars 11661 1726882385.71730: variable 'controller_profile' from source: play vars 11661 1726882385.71745: variable 'controller_device' from source: play vars 11661 1726882385.71811: variable 'controller_device' from source: play vars 11661 1726882385.71824: variable 'port1_profile' from source: play vars 11661 1726882385.71890: variable 'port1_profile' from source: play vars 11661 1726882385.71902: variable 'dhcp_interface1' from source: play vars 11661 1726882385.71969: variable 'dhcp_interface1' from source: play vars 11661 1726882385.71981: variable 'controller_profile' from source: play vars 11661 1726882385.72041: variable 'controller_profile' from source: play vars 11661 1726882385.72057: variable 'port2_profile' from source: play vars 11661 1726882385.72119: variable 'port2_profile' from source: play vars 11661 1726882385.72132: variable 'dhcp_interface2' from source: play vars 11661 1726882385.72196: variable 'dhcp_interface2' from source: play vars 11661 1726882385.72209: variable 'controller_profile' from source: play vars 11661 1726882385.72276: variable 'controller_profile' from source: play vars 11661 1726882385.72378: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11661 1726882385.72552: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11661 1726882385.72597: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11661 1726882385.72632: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11661 1726882385.72670: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11661 1726882385.72702: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11661 1726882385.72726: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11661 1726882385.72766: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882385.72785: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11661 1726882385.72831: variable '__network_team_connections_defined' from source: role '' defaults 11661 1726882385.72996: variable 'network_connections' from source: task vars 11661 1726882385.72999: variable 'controller_profile' from source: play vars 11661 1726882385.73042: variable 'controller_profile' from source: play vars 11661 1726882385.73048: variable 'controller_device' from source: play vars 11661 1726882385.73094: variable 'controller_device' from source: play vars 11661 1726882385.73101: variable 
'port1_profile' from source: play vars 11661 1726882385.73143: variable 'port1_profile' from source: play vars 11661 1726882385.73147: variable 'dhcp_interface1' from source: play vars 11661 1726882385.73193: variable 'dhcp_interface1' from source: play vars 11661 1726882385.73198: variable 'controller_profile' from source: play vars 11661 1726882385.73239: variable 'controller_profile' from source: play vars 11661 1726882385.73245: variable 'port2_profile' from source: play vars 11661 1726882385.73291: variable 'port2_profile' from source: play vars 11661 1726882385.73296: variable 'dhcp_interface2' from source: play vars 11661 1726882385.73337: variable 'dhcp_interface2' from source: play vars 11661 1726882385.73342: variable 'controller_profile' from source: play vars 11661 1726882385.73388: variable 'controller_profile' from source: play vars 11661 1726882385.73411: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11661 1726882385.73414: when evaluation is False, skipping this task 11661 1726882385.73417: _execute() done 11661 1726882385.73419: dumping result to json 11661 1726882385.73421: done dumping result, returning 11661 1726882385.73429: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-896b-2321-00000000002b] 11661 1726882385.73434: sending task result for task 0e448fcc-3ce9-896b-2321-00000000002b 11661 1726882385.73521: done sending task result for task 0e448fcc-3ce9-896b-2321-00000000002b 11661 1726882385.73524: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11661 1726882385.73580: no more pending results, returning what we have 11661 1726882385.73584: 
results queue empty 11661 1726882385.73585: checking for any_errors_fatal 11661 1726882385.73591: done checking for any_errors_fatal 11661 1726882385.73591: checking for max_fail_percentage 11661 1726882385.73593: done checking for max_fail_percentage 11661 1726882385.73594: checking to see if all hosts have failed and the running result is not ok 11661 1726882385.73594: done checking to see if all hosts have failed 11661 1726882385.73595: getting the remaining hosts for this loop 11661 1726882385.73596: done getting the remaining hosts for this loop 11661 1726882385.73599: getting the next task for host managed_node2 11661 1726882385.73606: done getting next task for host managed_node2 11661 1726882385.73610: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 11661 1726882385.73613: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11661 1726882385.73626: getting variables 11661 1726882385.73628: in VariableManager get_vars() 11661 1726882385.73671: Calling all_inventory to load vars for managed_node2 11661 1726882385.73674: Calling groups_inventory to load vars for managed_node2 11661 1726882385.73676: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882385.73684: Calling all_plugins_play to load vars for managed_node2 11661 1726882385.73687: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882385.73690: Calling groups_plugins_play to load vars for managed_node2 11661 1726882385.74674: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882385.76105: done with get_vars() 11661 1726882385.76124: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 11661 1726882385.76181: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:33:05 -0400 (0:00:00.097) 0:00:14.475 ****** 11661 1726882385.76204: entering _queue_task() for managed_node2/yum 11661 1726882385.76205: Creating lock for yum 11661 1726882385.76421: worker is 1 (out of 1 available) 11661 1726882385.76435: exiting _queue_task() for managed_node2/yum 11661 1726882385.76447: done queuing things up, now waiting for results queue to drain 11661 1726882385.76449: waiting for pending results... 
11661 1726882385.76614: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 11661 1726882385.76695: in run() - task 0e448fcc-3ce9-896b-2321-00000000002c 11661 1726882385.76712: variable 'ansible_search_path' from source: unknown 11661 1726882385.76716: variable 'ansible_search_path' from source: unknown 11661 1726882385.76743: calling self._execute() 11661 1726882385.76807: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882385.76816: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882385.76823: variable 'omit' from source: magic vars 11661 1726882385.77077: variable 'ansible_distribution_major_version' from source: facts 11661 1726882385.77087: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882385.77205: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11661 1726882385.79877: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11661 1726882385.79920: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11661 1726882385.79947: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11661 1726882385.79976: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11661 1726882385.79996: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11661 1726882385.80055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11661 1726882385.80078: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11661 1726882385.80095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882385.80121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11661 1726882385.80135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11661 1726882385.80205: variable 'ansible_distribution_major_version' from source: facts 11661 1726882385.80217: Evaluated conditional (ansible_distribution_major_version | int < 8): False 11661 1726882385.80220: when evaluation is False, skipping this task 11661 1726882385.80223: _execute() done 11661 1726882385.80225: dumping result to json 11661 1726882385.80233: done dumping result, returning 11661 1726882385.80239: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0e448fcc-3ce9-896b-2321-00000000002c] 11661 1726882385.80244: sending task result for task 0e448fcc-3ce9-896b-2321-00000000002c 11661 1726882385.80329: done sending task result for task 0e448fcc-3ce9-896b-2321-00000000002c 11661 1726882385.80331: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 11661 1726882385.80386: no more pending results, returning 
what we have 11661 1726882385.80390: results queue empty 11661 1726882385.80391: checking for any_errors_fatal 11661 1726882385.80396: done checking for any_errors_fatal 11661 1726882385.80397: checking for max_fail_percentage 11661 1726882385.80399: done checking for max_fail_percentage 11661 1726882385.80400: checking to see if all hosts have failed and the running result is not ok 11661 1726882385.80401: done checking to see if all hosts have failed 11661 1726882385.80401: getting the remaining hosts for this loop 11661 1726882385.80403: done getting the remaining hosts for this loop 11661 1726882385.80406: getting the next task for host managed_node2 11661 1726882385.80413: done getting next task for host managed_node2 11661 1726882385.80417: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 11661 1726882385.80420: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11661 1726882385.80433: getting variables 11661 1726882385.80435: in VariableManager get_vars() 11661 1726882385.80478: Calling all_inventory to load vars for managed_node2 11661 1726882385.80481: Calling groups_inventory to load vars for managed_node2 11661 1726882385.80483: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882385.80491: Calling all_plugins_play to load vars for managed_node2 11661 1726882385.80493: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882385.80495: Calling groups_plugins_play to load vars for managed_node2 11661 1726882385.81490: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882385.82968: done with get_vars() 11661 1726882385.82992: done getting variables 11661 1726882385.83050: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:33:05 -0400 (0:00:00.068) 0:00:14.544 ****** 11661 1726882385.83088: entering _queue_task() for managed_node2/fail 11661 1726882385.83369: worker is 1 (out of 1 available) 11661 1726882385.83381: exiting _queue_task() for managed_node2/fail 11661 1726882385.83392: done queuing things up, now waiting for results queue to drain 11661 1726882385.83394: waiting for pending results... 
11661 1726882385.83659: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 11661 1726882385.83793: in run() - task 0e448fcc-3ce9-896b-2321-00000000002d 11661 1726882385.83813: variable 'ansible_search_path' from source: unknown 11661 1726882385.83821: variable 'ansible_search_path' from source: unknown 11661 1726882385.83867: calling self._execute() 11661 1726882385.83949: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882385.83961: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882385.83979: variable 'omit' from source: magic vars 11661 1726882385.84277: variable 'ansible_distribution_major_version' from source: facts 11661 1726882385.84287: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882385.84366: variable '__network_wireless_connections_defined' from source: role '' defaults 11661 1726882385.84498: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11661 1726882385.86086: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11661 1726882385.86152: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11661 1726882385.86195: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11661 1726882385.86234: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11661 1726882385.86269: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11661 1726882385.86350: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 11661 1726882385.86386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11661 1726882385.86414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882385.86459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11661 1726882385.86487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11661 1726882385.86532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11661 1726882385.86561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11661 1726882385.86595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882385.86637: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11661 1726882385.86657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11661 1726882385.86706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11661 1726882385.86733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11661 1726882385.86761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882385.86811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11661 1726882385.86831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11661 1726882385.87014: variable 'network_connections' from source: task vars 11661 1726882385.87031: variable 'controller_profile' from source: play vars 11661 1726882385.87101: variable 'controller_profile' from source: play vars 11661 1726882385.87232: variable 'controller_device' from source: play vars 11661 1726882385.87297: variable 'controller_device' from source: play vars 11661 1726882385.87347: variable 'port1_profile' from source: play vars 11661 1726882385.87417: variable 'port1_profile' from source: play vars 11661 1726882385.87430: variable 'dhcp_interface1' from source: play vars 11661 1726882385.87502: variable 'dhcp_interface1' from source: play vars 11661 1726882385.87514: variable 'controller_profile' from source: play vars 11661 
1726882385.87581: variable 'controller_profile' from source: play vars 11661 1726882385.87594: variable 'port2_profile' from source: play vars 11661 1726882385.87662: variable 'port2_profile' from source: play vars 11661 1726882385.87679: variable 'dhcp_interface2' from source: play vars 11661 1726882385.87742: variable 'dhcp_interface2' from source: play vars 11661 1726882385.87754: variable 'controller_profile' from source: play vars 11661 1726882385.87823: variable 'controller_profile' from source: play vars 11661 1726882385.87903: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11661 1726882385.88099: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11661 1726882385.88140: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11661 1726882385.88210: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11661 1726882385.88243: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11661 1726882385.88292: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11661 1726882385.88325: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11661 1726882385.88356: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882385.88390: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, 
class_only=False) 11661 1726882385.88471: variable '__network_team_connections_defined' from source: role '' defaults 11661 1726882385.88725: variable 'network_connections' from source: task vars 11661 1726882385.88738: variable 'controller_profile' from source: play vars 11661 1726882385.88807: variable 'controller_profile' from source: play vars 11661 1726882385.88820: variable 'controller_device' from source: play vars 11661 1726882385.88889: variable 'controller_device' from source: play vars 11661 1726882385.88905: variable 'port1_profile' from source: play vars 11661 1726882385.88974: variable 'port1_profile' from source: play vars 11661 1726882385.88988: variable 'dhcp_interface1' from source: play vars 11661 1726882385.89051: variable 'dhcp_interface1' from source: play vars 11661 1726882385.89067: variable 'controller_profile' from source: play vars 11661 1726882385.89130: variable 'controller_profile' from source: play vars 11661 1726882385.89142: variable 'port2_profile' from source: play vars 11661 1726882385.89209: variable 'port2_profile' from source: play vars 11661 1726882385.89222: variable 'dhcp_interface2' from source: play vars 11661 1726882385.89286: variable 'dhcp_interface2' from source: play vars 11661 1726882385.89301: variable 'controller_profile' from source: play vars 11661 1726882385.89365: variable 'controller_profile' from source: play vars 11661 1726882385.89409: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11661 1726882385.89416: when evaluation is False, skipping this task 11661 1726882385.89423: _execute() done 11661 1726882385.89430: dumping result to json 11661 1726882385.89436: done dumping result, returning 11661 1726882385.89447: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-896b-2321-00000000002d] 11661 1726882385.89457: sending 
task result for task 0e448fcc-3ce9-896b-2321-00000000002d skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11661 1726882385.89615: no more pending results, returning what we have 11661 1726882385.89619: results queue empty 11661 1726882385.89620: checking for any_errors_fatal 11661 1726882385.89626: done checking for any_errors_fatal 11661 1726882385.89627: checking for max_fail_percentage 11661 1726882385.89629: done checking for max_fail_percentage 11661 1726882385.89629: checking to see if all hosts have failed and the running result is not ok 11661 1726882385.89630: done checking to see if all hosts have failed 11661 1726882385.89631: getting the remaining hosts for this loop 11661 1726882385.89633: done getting the remaining hosts for this loop 11661 1726882385.89637: getting the next task for host managed_node2 11661 1726882385.89644: done getting next task for host managed_node2 11661 1726882385.89648: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 11661 1726882385.89651: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11661 1726882385.89670: getting variables 11661 1726882385.89672: in VariableManager get_vars() 11661 1726882385.89716: Calling all_inventory to load vars for managed_node2 11661 1726882385.89719: Calling groups_inventory to load vars for managed_node2 11661 1726882385.89721: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882385.89731: Calling all_plugins_play to load vars for managed_node2 11661 1726882385.89734: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882385.89738: Calling groups_plugins_play to load vars for managed_node2 11661 1726882385.90785: done sending task result for task 0e448fcc-3ce9-896b-2321-00000000002d 11661 1726882385.90789: WORKER PROCESS EXITING 11661 1726882385.90921: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882385.91956: done with get_vars() 11661 1726882385.91975: done getting variables 11661 1726882385.92018: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Install packages] ********************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73
Friday 20 September 2024 21:33:05 -0400 (0:00:00.089) 0:00:14.633 ******
11661 1726882385.92044: entering _queue_task() for managed_node2/package 11661 1726882385.92262: worker is 1 (out of 1 available) 11661 1726882385.92276: exiting _queue_task() for managed_node2/package 11661 1726882385.92290: done queuing things up, now waiting for results queue to drain 11661 1726882385.92292: waiting for pending results... 
11661 1726882385.92458: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 11661 1726882385.92546: in run() - task 0e448fcc-3ce9-896b-2321-00000000002e 11661 1726882385.92560: variable 'ansible_search_path' from source: unknown 11661 1726882385.92566: variable 'ansible_search_path' from source: unknown 11661 1726882385.92596: calling self._execute() 11661 1726882385.92662: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882385.92670: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882385.92679: variable 'omit' from source: magic vars 11661 1726882385.92945: variable 'ansible_distribution_major_version' from source: facts 11661 1726882385.92957: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882385.93093: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11661 1726882385.93289: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11661 1726882385.93321: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11661 1726882385.93346: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11661 1726882385.93376: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11661 1726882385.93449: variable 'network_packages' from source: role '' defaults 11661 1726882385.93527: variable '__network_provider_setup' from source: role '' defaults 11661 1726882385.93535: variable '__network_service_name_default_nm' from source: role '' defaults 11661 1726882385.93583: variable '__network_service_name_default_nm' from source: role '' defaults 11661 1726882385.93590: variable '__network_packages_default_nm' from source: role '' defaults 11661 1726882385.93634: variable 
'__network_packages_default_nm' from source: role '' defaults 11661 1726882385.93786: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11661 1726882385.95193: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11661 1726882385.95245: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11661 1726882385.95275: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11661 1726882385.95298: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11661 1726882385.95320: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11661 1726882385.95379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11661 1726882385.95398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11661 1726882385.95417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882385.95445: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11661 1726882385.95459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11661 
1726882385.95493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11661 1726882385.95509: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11661 1726882385.95525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882385.95556: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11661 1726882385.95569: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11661 1726882385.95712: variable '__network_packages_default_gobject_packages' from source: role '' defaults 11661 1726882385.95788: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11661 1726882385.95804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11661 1726882385.95821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882385.95845: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11661 1726882385.95858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11661 1726882385.95924: variable 'ansible_python' from source: facts 11661 1726882385.95943: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 11661 1726882385.96004: variable '__network_wpa_supplicant_required' from source: role '' defaults 11661 1726882385.96054: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11661 1726882385.96142: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11661 1726882385.96160: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11661 1726882385.96179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882385.96207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11661 1726882385.96218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11661 1726882385.96249: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11661 1726882385.96271: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11661 1726882385.96288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882385.96317: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11661 1726882385.96327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11661 1726882385.96419: variable 'network_connections' from source: task vars 11661 1726882385.96426: variable 'controller_profile' from source: play vars 11661 1726882385.96496: variable 'controller_profile' from source: play vars 11661 1726882385.96503: variable 'controller_device' from source: play vars 11661 1726882385.96574: variable 'controller_device' from source: play vars 11661 1726882385.96584: variable 'port1_profile' from source: play vars 11661 1726882385.96654: variable 'port1_profile' from source: play vars 11661 1726882385.96661: variable 'dhcp_interface1' from source: play vars 11661 1726882385.96730: variable 'dhcp_interface1' from source: play vars 11661 1726882385.96734: variable 'controller_profile' from source: play vars 11661 1726882385.96803: variable 'controller_profile' from source: play vars 11661 1726882385.96811: variable 'port2_profile' from source: play vars 11661 
1726882385.96881: variable 'port2_profile' from source: play vars 11661 1726882385.96889: variable 'dhcp_interface2' from source: play vars 11661 1726882385.96959: variable 'dhcp_interface2' from source: play vars 11661 1726882385.96962: variable 'controller_profile' from source: play vars 11661 1726882385.97028: variable 'controller_profile' from source: play vars 11661 1726882385.97084: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11661 1726882385.97103: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11661 1726882385.97123: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882385.97143: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11661 1726882385.97186: variable '__network_wireless_connections_defined' from source: role '' defaults 11661 1726882385.97367: variable 'network_connections' from source: task vars 11661 1726882385.97370: variable 'controller_profile' from source: play vars 11661 1726882385.97439: variable 'controller_profile' from source: play vars 11661 1726882385.97447: variable 'controller_device' from source: play vars 11661 1726882385.97518: variable 'controller_device' from source: play vars 11661 1726882385.97525: variable 'port1_profile' from source: play vars 11661 1726882385.97593: variable 'port1_profile' from source: play vars 11661 1726882385.97605: variable 'dhcp_interface1' from source: play vars 11661 1726882385.97671: variable 'dhcp_interface1' from source: 
play vars 11661 1726882385.97678: variable 'controller_profile' from source: play vars 11661 1726882385.97746: variable 'controller_profile' from source: play vars 11661 1726882385.97760: variable 'port2_profile' from source: play vars 11661 1726882385.97828: variable 'port2_profile' from source: play vars 11661 1726882385.97836: variable 'dhcp_interface2' from source: play vars 11661 1726882385.97909: variable 'dhcp_interface2' from source: play vars 11661 1726882385.97916: variable 'controller_profile' from source: play vars 11661 1726882385.97988: variable 'controller_profile' from source: play vars 11661 1726882385.98024: variable '__network_packages_default_wireless' from source: role '' defaults 11661 1726882385.98083: variable '__network_wireless_connections_defined' from source: role '' defaults 11661 1726882385.98290: variable 'network_connections' from source: task vars 11661 1726882385.98295: variable 'controller_profile' from source: play vars 11661 1726882385.98339: variable 'controller_profile' from source: play vars 11661 1726882385.98345: variable 'controller_device' from source: play vars 11661 1726882385.98397: variable 'controller_device' from source: play vars 11661 1726882385.98405: variable 'port1_profile' from source: play vars 11661 1726882385.98448: variable 'port1_profile' from source: play vars 11661 1726882385.98457: variable 'dhcp_interface1' from source: play vars 11661 1726882385.98504: variable 'dhcp_interface1' from source: play vars 11661 1726882385.98509: variable 'controller_profile' from source: play vars 11661 1726882385.98556: variable 'controller_profile' from source: play vars 11661 1726882385.98562: variable 'port2_profile' from source: play vars 11661 1726882385.98610: variable 'port2_profile' from source: play vars 11661 1726882385.98616: variable 'dhcp_interface2' from source: play vars 11661 1726882385.98666: variable 'dhcp_interface2' from source: play vars 11661 1726882385.98672: variable 'controller_profile' from 
source: play vars 11661 1726882385.98718: variable 'controller_profile' from source: play vars 11661 1726882385.98737: variable '__network_packages_default_team' from source: role '' defaults 11661 1726882385.98796: variable '__network_team_connections_defined' from source: role '' defaults 11661 1726882385.98987: variable 'network_connections' from source: task vars 11661 1726882385.98991: variable 'controller_profile' from source: play vars 11661 1726882385.99040: variable 'controller_profile' from source: play vars 11661 1726882385.99043: variable 'controller_device' from source: play vars 11661 1726882385.99092: variable 'controller_device' from source: play vars 11661 1726882385.99099: variable 'port1_profile' from source: play vars 11661 1726882385.99147: variable 'port1_profile' from source: play vars 11661 1726882385.99150: variable 'dhcp_interface1' from source: play vars 11661 1726882385.99199: variable 'dhcp_interface1' from source: play vars 11661 1726882385.99205: variable 'controller_profile' from source: play vars 11661 1726882385.99253: variable 'controller_profile' from source: play vars 11661 1726882385.99259: variable 'port2_profile' from source: play vars 11661 1726882385.99304: variable 'port2_profile' from source: play vars 11661 1726882385.99310: variable 'dhcp_interface2' from source: play vars 11661 1726882385.99361: variable 'dhcp_interface2' from source: play vars 11661 1726882385.99365: variable 'controller_profile' from source: play vars 11661 1726882385.99409: variable 'controller_profile' from source: play vars 11661 1726882385.99456: variable '__network_service_name_default_initscripts' from source: role '' defaults 11661 1726882385.99500: variable '__network_service_name_default_initscripts' from source: role '' defaults 11661 1726882385.99506: variable '__network_packages_default_initscripts' from source: role '' defaults 11661 1726882385.99547: variable '__network_packages_default_initscripts' from source: role '' defaults 11661 
1726882385.99688: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 11661 1726882385.99986: variable 'network_connections' from source: task vars 11661 1726882385.99990: variable 'controller_profile' from source: play vars 11661 1726882386.00031: variable 'controller_profile' from source: play vars 11661 1726882386.00039: variable 'controller_device' from source: play vars 11661 1726882386.00083: variable 'controller_device' from source: play vars 11661 1726882386.00090: variable 'port1_profile' from source: play vars 11661 1726882386.00133: variable 'port1_profile' from source: play vars 11661 1726882386.00138: variable 'dhcp_interface1' from source: play vars 11661 1726882386.00184: variable 'dhcp_interface1' from source: play vars 11661 1726882386.00189: variable 'controller_profile' from source: play vars 11661 1726882386.00233: variable 'controller_profile' from source: play vars 11661 1726882386.00239: variable 'port2_profile' from source: play vars 11661 1726882386.00285: variable 'port2_profile' from source: play vars 11661 1726882386.00291: variable 'dhcp_interface2' from source: play vars 11661 1726882386.00337: variable 'dhcp_interface2' from source: play vars 11661 1726882386.00340: variable 'controller_profile' from source: play vars 11661 1726882386.00384: variable 'controller_profile' from source: play vars 11661 1726882386.00390: variable 'ansible_distribution' from source: facts 11661 1726882386.00393: variable '__network_rh_distros' from source: role '' defaults 11661 1726882386.00398: variable 'ansible_distribution_major_version' from source: facts 11661 1726882386.00419: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 11661 1726882386.00527: variable 'ansible_distribution' from source: facts 11661 1726882386.00531: variable '__network_rh_distros' from source: role '' defaults 11661 1726882386.00533: variable 'ansible_distribution_major_version' from source: 
facts 11661 1726882386.00542: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 11661 1726882386.00649: variable 'ansible_distribution' from source: facts 11661 1726882386.00655: variable '__network_rh_distros' from source: role '' defaults 11661 1726882386.00664: variable 'ansible_distribution_major_version' from source: facts 11661 1726882386.00689: variable 'network_provider' from source: set_fact 11661 1726882386.00701: variable 'ansible_facts' from source: unknown 11661 1726882386.01182: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 11661 1726882386.01185: when evaluation is False, skipping this task 11661 1726882386.01188: _execute() done 11661 1726882386.01190: dumping result to json 11661 1726882386.01192: done dumping result, returning 11661 1726882386.01203: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0e448fcc-3ce9-896b-2321-00000000002e] 11661 1726882386.01206: sending task result for task 0e448fcc-3ce9-896b-2321-00000000002e 11661 1726882386.01295: done sending task result for task 0e448fcc-3ce9-896b-2321-00000000002e 11661 1726882386.01298: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 11661 1726882386.01348: no more pending results, returning what we have 11661 1726882386.01354: results queue empty 11661 1726882386.01355: checking for any_errors_fatal 11661 1726882386.01361: done checking for any_errors_fatal 11661 1726882386.01362: checking for max_fail_percentage 11661 1726882386.01365: done checking for max_fail_percentage 11661 1726882386.01366: checking to see if all hosts have failed and the running result is not ok 11661 1726882386.01367: done checking to see if all hosts have failed 11661 1726882386.01367: getting the remaining hosts for 
this loop 11661 1726882386.01369: done getting the remaining hosts for this loop 11661 1726882386.01372: getting the next task for host managed_node2 11661 1726882386.01379: done getting next task for host managed_node2 11661 1726882386.01383: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 11661 1726882386.01385: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11661 1726882386.01403: getting variables 11661 1726882386.01405: in VariableManager get_vars() 11661 1726882386.01446: Calling all_inventory to load vars for managed_node2 11661 1726882386.01449: Calling groups_inventory to load vars for managed_node2 11661 1726882386.01453: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882386.01462: Calling all_plugins_play to load vars for managed_node2 11661 1726882386.01466: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882386.01469: Calling groups_plugins_play to load vars for managed_node2 11661 1726882386.02344: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882386.03878: done with get_vars() 11661 1726882386.03899: done getting variables 11661 1726882386.03944: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
Friday 20 September 2024 21:33:06 -0400 (0:00:00.119) 0:00:14.753 ******
11661 1726882386.03971: entering _queue_task() for managed_node2/package 11661 1726882386.04196: worker is 1 (out of 1 available) 11661 1726882386.04209: exiting _queue_task() for managed_node2/package 11661 1726882386.04221: done queuing things up, now waiting for results queue to drain 11661 1726882386.04222: waiting for pending results... 11661 1726882386.04395: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 11661 1726882386.04483: in run() - task 0e448fcc-3ce9-896b-2321-00000000002f 11661 1726882386.04495: variable 'ansible_search_path' from source: unknown 11661 1726882386.04498: variable 'ansible_search_path' from source: unknown 11661 1726882386.04526: calling self._execute() 11661 1726882386.04594: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882386.04598: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882386.04608: variable 'omit' from source: magic vars 11661 1726882386.04876: variable 'ansible_distribution_major_version' from source: facts 11661 1726882386.04887: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882386.04968: variable 'network_state' from source: role '' defaults 11661 1726882386.04976: Evaluated conditional (network_state != {}): False 11661 1726882386.04979: when evaluation is False, skipping this task 11661 1726882386.04982: _execute() done 11661 
1726882386.04984: dumping result to json 11661 1726882386.04988: done dumping result, returning 11661 1726882386.04994: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0e448fcc-3ce9-896b-2321-00000000002f] 11661 1726882386.05004: sending task result for task 0e448fcc-3ce9-896b-2321-00000000002f 11661 1726882386.05092: done sending task result for task 0e448fcc-3ce9-896b-2321-00000000002f 11661 1726882386.05095: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11661 1726882386.05155: no more pending results, returning what we have 11661 1726882386.05159: results queue empty 11661 1726882386.05160: checking for any_errors_fatal 11661 1726882386.05169: done checking for any_errors_fatal 11661 1726882386.05169: checking for max_fail_percentage 11661 1726882386.05171: done checking for max_fail_percentage 11661 1726882386.05173: checking to see if all hosts have failed and the running result is not ok 11661 1726882386.05173: done checking to see if all hosts have failed 11661 1726882386.05174: getting the remaining hosts for this loop 11661 1726882386.05176: done getting the remaining hosts for this loop 11661 1726882386.05179: getting the next task for host managed_node2 11661 1726882386.05186: done getting next task for host managed_node2 11661 1726882386.05190: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 11661 1726882386.05193: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11661 1726882386.05210: getting variables 11661 1726882386.05212: in VariableManager get_vars() 11661 1726882386.05248: Calling all_inventory to load vars for managed_node2 11661 1726882386.05253: Calling groups_inventory to load vars for managed_node2 11661 1726882386.05255: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882386.05265: Calling all_plugins_play to load vars for managed_node2 11661 1726882386.05268: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882386.05271: Calling groups_plugins_play to load vars for managed_node2 11661 1726882386.10274: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882386.11982: done with get_vars() 11661 1726882386.12013: done getting variables 11661 1726882386.12067: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
Friday 20 September 2024 21:33:06 -0400 (0:00:00.081) 0:00:14.834 ******
11661 1726882386.12103: entering _queue_task() for managed_node2/package 11661 1726882386.12416: worker is 1 (out of 1 available) 11661 1726882386.12428: exiting _queue_task() 
for managed_node2/package 11661 1726882386.12439: done queuing things up, now waiting for results queue to drain 11661 1726882386.12441: waiting for pending results... 11661 1726882386.12714: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 11661 1726882386.12860: in run() - task 0e448fcc-3ce9-896b-2321-000000000030 11661 1726882386.12883: variable 'ansible_search_path' from source: unknown 11661 1726882386.12891: variable 'ansible_search_path' from source: unknown 11661 1726882386.12929: calling self._execute() 11661 1726882386.13030: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882386.13042: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882386.13060: variable 'omit' from source: magic vars 11661 1726882386.13448: variable 'ansible_distribution_major_version' from source: facts 11661 1726882386.13471: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882386.13600: variable 'network_state' from source: role '' defaults 11661 1726882386.13616: Evaluated conditional (network_state != {}): False 11661 1726882386.13623: when evaluation is False, skipping this task 11661 1726882386.13629: _execute() done 11661 1726882386.13640: dumping result to json 11661 1726882386.13647: done dumping result, returning 11661 1726882386.13661: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-896b-2321-000000000030] 11661 1726882386.13673: sending task result for task 0e448fcc-3ce9-896b-2321-000000000030 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11661 1726882386.13828: no more pending results, returning what we have 11661 1726882386.13832: results queue empty 11661 1726882386.13833: 
checking for any_errors_fatal 11661 1726882386.13839: done checking for any_errors_fatal 11661 1726882386.13840: checking for max_fail_percentage 11661 1726882386.13842: done checking for max_fail_percentage 11661 1726882386.13843: checking to see if all hosts have failed and the running result is not ok 11661 1726882386.13844: done checking to see if all hosts have failed 11661 1726882386.13845: getting the remaining hosts for this loop 11661 1726882386.13847: done getting the remaining hosts for this loop 11661 1726882386.13854: getting the next task for host managed_node2 11661 1726882386.13862: done getting next task for host managed_node2 11661 1726882386.13869: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 11661 1726882386.13877: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11661 1726882386.13895: getting variables 11661 1726882386.13896: in VariableManager get_vars() 11661 1726882386.13938: Calling all_inventory to load vars for managed_node2 11661 1726882386.13941: Calling groups_inventory to load vars for managed_node2 11661 1726882386.13944: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882386.13958: Calling all_plugins_play to load vars for managed_node2 11661 1726882386.13961: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882386.13966: Calling groups_plugins_play to load vars for managed_node2 11661 1726882386.15086: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000030 11661 1726882386.15090: WORKER PROCESS EXITING 11661 1726882386.15682: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882386.17437: done with get_vars() 11661 1726882386.17467: done getting variables 11661 1726882386.17569: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:33:06 -0400 (0:00:00.054) 0:00:14.889 ****** 11661 1726882386.17602: entering _queue_task() for managed_node2/service 11661 1726882386.17604: Creating lock for service 11661 1726882386.17917: worker is 1 (out of 1 available) 11661 1726882386.17930: exiting _queue_task() for managed_node2/service 11661 1726882386.17941: done queuing things up, now waiting for results queue to drain 11661 1726882386.17943: waiting for pending results... 
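The skip recorded just above comes from the role task "Install python3-libnmstate when using network_state variable": the conditional `network_state != {}` evaluated False because `network_state` still holds its role default of `{}`. A minimal sketch of how a task gated this way is typically written (hypothetical; the role's actual task source is not shown in this log):

```yaml
# Hypothetical sketch of a conditionally-skipped install task.
# The package name and the when-clause mirror the task title and
# false_condition reported in the log; the role's real source may differ.
- name: Install python3-libnmstate when using network_state variable
  ansible.builtin.package:
    name: python3-libnmstate
    state: present
  when: network_state != {}   # False here, so the task is skipped
```

When the conditional is False, Ansible reports `skip_reason: Conditional result was False` and records the failing expression in `false_condition`, exactly as in the JSON result above.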
11661 1726882386.18222: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 11661 1726882386.18371: in run() - task 0e448fcc-3ce9-896b-2321-000000000031 11661 1726882386.18396: variable 'ansible_search_path' from source: unknown 11661 1726882386.18403: variable 'ansible_search_path' from source: unknown 11661 1726882386.18439: calling self._execute() 11661 1726882386.18535: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882386.18546: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882386.18561: variable 'omit' from source: magic vars 11661 1726882386.18937: variable 'ansible_distribution_major_version' from source: facts 11661 1726882386.18957: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882386.19085: variable '__network_wireless_connections_defined' from source: role '' defaults 11661 1726882386.19292: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11661 1726882386.22594: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11661 1726882386.22729: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11661 1726882386.22821: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11661 1726882386.22957: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11661 1726882386.23062: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11661 1726882386.23352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 11661 1726882386.23393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11661 1726882386.23438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882386.23496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11661 1726882386.23522: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11661 1726882386.23594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11661 1726882386.23623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11661 1726882386.23654: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882386.23700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11661 1726882386.23718: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11661 1726882386.23762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11661 1726882386.23790: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11661 1726882386.23817: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882386.23858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11661 1726882386.23879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11661 1726882386.24043: variable 'network_connections' from source: task vars 11661 1726882386.24062: variable 'controller_profile' from source: play vars 11661 1726882386.24141: variable 'controller_profile' from source: play vars 11661 1726882386.24157: variable 'controller_device' from source: play vars 11661 1726882386.24226: variable 'controller_device' from source: play vars 11661 1726882386.24241: variable 'port1_profile' from source: play vars 11661 1726882386.24314: variable 'port1_profile' from source: play vars 11661 1726882386.24325: variable 'dhcp_interface1' from source: play vars 11661 1726882386.24389: variable 'dhcp_interface1' from source: play vars 11661 1726882386.24400: variable 'controller_profile' from source: play vars 11661 
1726882386.24459: variable 'controller_profile' from source: play vars 11661 1726882386.24472: variable 'port2_profile' from source: play vars 11661 1726882386.24536: variable 'port2_profile' from source: play vars 11661 1726882386.24549: variable 'dhcp_interface2' from source: play vars 11661 1726882386.24614: variable 'dhcp_interface2' from source: play vars 11661 1726882386.24630: variable 'controller_profile' from source: play vars 11661 1726882386.24695: variable 'controller_profile' from source: play vars 11661 1726882386.24779: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11661 1726882386.24958: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11661 1726882386.25011: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11661 1726882386.25046: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11661 1726882386.25088: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11661 1726882386.25134: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11661 1726882386.25168: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11661 1726882386.25201: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882386.25231: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, 
class_only=False) 11661 1726882386.25311: variable '__network_team_connections_defined' from source: role '' defaults 11661 1726882386.26492: variable 'network_connections' from source: task vars 11661 1726882386.26502: variable 'controller_profile' from source: play vars 11661 1726882386.26576: variable 'controller_profile' from source: play vars 11661 1726882386.26683: variable 'controller_device' from source: play vars 11661 1726882386.26865: variable 'controller_device' from source: play vars 11661 1726882386.26880: variable 'port1_profile' from source: play vars 11661 1726882386.26942: variable 'port1_profile' from source: play vars 11661 1726882386.26981: variable 'dhcp_interface1' from source: play vars 11661 1726882386.27040: variable 'dhcp_interface1' from source: play vars 11661 1726882386.27057: variable 'controller_profile' from source: play vars 11661 1726882386.27124: variable 'controller_profile' from source: play vars 11661 1726882386.27137: variable 'port2_profile' from source: play vars 11661 1726882386.27197: variable 'port2_profile' from source: play vars 11661 1726882386.27208: variable 'dhcp_interface2' from source: play vars 11661 1726882386.27271: variable 'dhcp_interface2' from source: play vars 11661 1726882386.27287: variable 'controller_profile' from source: play vars 11661 1726882386.27352: variable 'controller_profile' from source: play vars 11661 1726882386.27396: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11661 1726882386.27405: when evaluation is False, skipping this task 11661 1726882386.27412: _execute() done 11661 1726882386.27418: dumping result to json 11661 1726882386.27425: done dumping result, returning 11661 1726882386.27437: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-896b-2321-000000000031] 11661 1726882386.27446: sending task result for task 
0e448fcc-3ce9-896b-2321-000000000031 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11661 1726882386.27602: no more pending results, returning what we have 11661 1726882386.27605: results queue empty 11661 1726882386.27606: checking for any_errors_fatal 11661 1726882386.27613: done checking for any_errors_fatal 11661 1726882386.27613: checking for max_fail_percentage 11661 1726882386.27615: done checking for max_fail_percentage 11661 1726882386.27616: checking to see if all hosts have failed and the running result is not ok 11661 1726882386.27617: done checking to see if all hosts have failed 11661 1726882386.27618: getting the remaining hosts for this loop 11661 1726882386.27619: done getting the remaining hosts for this loop 11661 1726882386.27623: getting the next task for host managed_node2 11661 1726882386.27631: done getting next task for host managed_node2 11661 1726882386.27635: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 11661 1726882386.27638: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11661 1726882386.27655: getting variables 11661 1726882386.27657: in VariableManager get_vars() 11661 1726882386.27701: Calling all_inventory to load vars for managed_node2 11661 1726882386.27704: Calling groups_inventory to load vars for managed_node2 11661 1726882386.27707: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882386.27717: Calling all_plugins_play to load vars for managed_node2 11661 1726882386.27720: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882386.27723: Calling groups_plugins_play to load vars for managed_node2 11661 1726882386.29196: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000031 11661 1726882386.29200: WORKER PROCESS EXITING 11661 1726882386.30782: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882386.33424: done with get_vars() 11661 1726882386.33456: done getting variables 11661 1726882386.33521: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:33:06 -0400 (0:00:00.159) 0:00:15.049 ****** 11661 1726882386.33560: entering _queue_task() for managed_node2/service 11661 1726882386.34586: worker is 1 (out of 1 available) 11661 1726882386.34599: exiting _queue_task() for managed_node2/service 11661 1726882386.34611: done queuing things up, now waiting for results queue to drain 11661 1726882386.34613: waiting for pending results... 
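The task queued here, "Enable and start NetworkManager" (roles/network/tasks/main.yml:122), uses the role default `network_service_name` to pick the unit to manage. A hedged sketch of such a service task (illustrative only, not the role's verbatim source):

```yaml
# Hypothetical sketch; the role's actual task lives at
# roles/network/tasks/main.yml:122 and may be written differently.
- name: Enable and start NetworkManager
  ansible.builtin.service:
    name: "{{ network_service_name }}"
    state: started
    enabled: true
  when: network_provider == "nm" or network_state != {}
```

As the trace below shows, this conditional evaluates True (`network_provider == "nm"`), so unlike the two preceding tasks this one proceeds to the execution phase.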
11661 1726882386.35583: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 11661 1726882386.36240: in run() - task 0e448fcc-3ce9-896b-2321-000000000032 11661 1726882386.36270: variable 'ansible_search_path' from source: unknown 11661 1726882386.36278: variable 'ansible_search_path' from source: unknown 11661 1726882386.36317: calling self._execute() 11661 1726882386.36408: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882386.36419: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882386.36432: variable 'omit' from source: magic vars 11661 1726882386.36807: variable 'ansible_distribution_major_version' from source: facts 11661 1726882386.36824: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882386.36995: variable 'network_provider' from source: set_fact 11661 1726882386.37005: variable 'network_state' from source: role '' defaults 11661 1726882386.37022: Evaluated conditional (network_provider == "nm" or network_state != {}): True 11661 1726882386.37032: variable 'omit' from source: magic vars 11661 1726882386.37096: variable 'omit' from source: magic vars 11661 1726882386.37133: variable 'network_service_name' from source: role '' defaults 11661 1726882386.37203: variable 'network_service_name' from source: role '' defaults 11661 1726882386.37317: variable '__network_provider_setup' from source: role '' defaults 11661 1726882386.37329: variable '__network_service_name_default_nm' from source: role '' defaults 11661 1726882386.37401: variable '__network_service_name_default_nm' from source: role '' defaults 11661 1726882386.37413: variable '__network_packages_default_nm' from source: role '' defaults 11661 1726882386.37486: variable '__network_packages_default_nm' from source: role '' defaults 11661 1726882386.37724: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 11661 1726882386.40076: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11661 1726882386.40165: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11661 1726882386.40206: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11661 1726882386.40242: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11661 1726882386.40281: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11661 1726882386.40362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11661 1726882386.40399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11661 1726882386.40427: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882386.40480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11661 1726882386.40497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11661 1726882386.40543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 11661 1726882386.40575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11661 1726882386.40606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882386.40652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11661 1726882386.40674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11661 1726882386.40913: variable '__network_packages_default_gobject_packages' from source: role '' defaults 11661 1726882386.41037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11661 1726882386.41072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11661 1726882386.41102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882386.41152: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11661 1726882386.41173: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11661 1726882386.41272: variable 'ansible_python' from source: facts 11661 1726882386.41299: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 11661 1726882386.41393: variable '__network_wpa_supplicant_required' from source: role '' defaults 11661 1726882386.41483: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11661 1726882386.41612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11661 1726882386.41640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11661 1726882386.41673: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882386.41718: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11661 1726882386.41737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11661 1726882386.41796: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11661 1726882386.41833: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11661 1726882386.41866: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882386.41912: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11661 1726882386.41931: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11661 1726882386.42087: variable 'network_connections' from source: task vars 11661 1726882386.42098: variable 'controller_profile' from source: play vars 11661 1726882386.42184: variable 'controller_profile' from source: play vars 11661 1726882386.42201: variable 'controller_device' from source: play vars 11661 1726882386.42282: variable 'controller_device' from source: play vars 11661 1726882386.42299: variable 'port1_profile' from source: play vars 11661 1726882386.42381: variable 'port1_profile' from source: play vars 11661 1726882386.42396: variable 'dhcp_interface1' from source: play vars 11661 1726882386.42476: variable 'dhcp_interface1' from source: play vars 11661 1726882386.42491: variable 'controller_profile' from source: play vars 11661 1726882386.42572: variable 'controller_profile' from source: play vars 11661 1726882386.42588: variable 'port2_profile' from source: play vars 11661 1726882386.42668: variable 'port2_profile' from source: play vars 11661 1726882386.42686: variable 'dhcp_interface2' from source: play vars 11661 1726882386.42762: variable 'dhcp_interface2' from source: play vars 11661 
1726882386.42780: variable 'controller_profile' from source: play vars 11661 1726882386.42854: variable 'controller_profile' from source: play vars 11661 1726882386.42948: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11661 1726882386.43159: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11661 1726882386.43218: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11661 1726882386.43261: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11661 1726882386.43301: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11661 1726882386.43377: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11661 1726882386.43414: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11661 1726882386.43456: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882386.43495: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11661 1726882386.43555: variable '__network_wireless_connections_defined' from source: role '' defaults 11661 1726882386.43976: variable 'network_connections' from source: task vars 11661 1726882386.44080: variable 'controller_profile' from source: play vars 11661 1726882386.44161: variable 'controller_profile' from source: play vars 11661 
1726882386.44193: variable 'controller_device' from source: play vars 11661 1726882386.44343: variable 'controller_device' from source: play vars 11661 1726882386.44386: variable 'port1_profile' from source: play vars 11661 1726882386.44469: variable 'port1_profile' from source: play vars 11661 1726882386.44583: variable 'dhcp_interface1' from source: play vars 11661 1726882386.44775: variable 'dhcp_interface1' from source: play vars 11661 1726882386.44791: variable 'controller_profile' from source: play vars 11661 1726882386.44984: variable 'controller_profile' from source: play vars 11661 1726882386.45001: variable 'port2_profile' from source: play vars 11661 1726882386.45088: variable 'port2_profile' from source: play vars 11661 1726882386.45104: variable 'dhcp_interface2' from source: play vars 11661 1726882386.45196: variable 'dhcp_interface2' from source: play vars 11661 1726882386.45212: variable 'controller_profile' from source: play vars 11661 1726882386.45295: variable 'controller_profile' from source: play vars 11661 1726882386.45357: variable '__network_packages_default_wireless' from source: role '' defaults 11661 1726882386.45444: variable '__network_wireless_connections_defined' from source: role '' defaults 11661 1726882386.45768: variable 'network_connections' from source: task vars 11661 1726882386.45780: variable 'controller_profile' from source: play vars 11661 1726882386.45858: variable 'controller_profile' from source: play vars 11661 1726882386.45874: variable 'controller_device' from source: play vars 11661 1726882386.45949: variable 'controller_device' from source: play vars 11661 1726882386.45969: variable 'port1_profile' from source: play vars 11661 1726882386.46043: variable 'port1_profile' from source: play vars 11661 1726882386.46059: variable 'dhcp_interface1' from source: play vars 11661 1726882386.46134: variable 'dhcp_interface1' from source: play vars 11661 1726882386.46146: variable 'controller_profile' from source: play vars 
11661 1726882386.46222: variable 'controller_profile' from source: play vars 11661 1726882386.46234: variable 'port2_profile' from source: play vars 11661 1726882386.46312: variable 'port2_profile' from source: play vars 11661 1726882386.46324: variable 'dhcp_interface2' from source: play vars 11661 1726882386.46404: variable 'dhcp_interface2' from source: play vars 11661 1726882386.46416: variable 'controller_profile' from source: play vars 11661 1726882386.46492: variable 'controller_profile' from source: play vars 11661 1726882386.46523: variable '__network_packages_default_team' from source: role '' defaults 11661 1726882386.46612: variable '__network_team_connections_defined' from source: role '' defaults 11661 1726882386.47148: variable 'network_connections' from source: task vars 11661 1726882386.47165: variable 'controller_profile' from source: play vars 11661 1726882386.47241: variable 'controller_profile' from source: play vars 11661 1726882386.47256: variable 'controller_device' from source: play vars 11661 1726882386.47327: variable 'controller_device' from source: play vars 11661 1726882386.47345: variable 'port1_profile' from source: play vars 11661 1726882386.47420: variable 'port1_profile' from source: play vars 11661 1726882386.47482: variable 'dhcp_interface1' from source: play vars 11661 1726882386.47624: variable 'dhcp_interface1' from source: play vars 11661 1726882386.47673: variable 'controller_profile' from source: play vars 11661 1726882386.47741: variable 'controller_profile' from source: play vars 11661 1726882386.47889: variable 'port2_profile' from source: play vars 11661 1726882386.47963: variable 'port2_profile' from source: play vars 11661 1726882386.47978: variable 'dhcp_interface2' from source: play vars 11661 1726882386.48166: variable 'dhcp_interface2' from source: play vars 11661 1726882386.48178: variable 'controller_profile' from source: play vars 11661 1726882386.48252: variable 'controller_profile' from source: play vars 
11661 1726882386.48441: variable '__network_service_name_default_initscripts' from source: role '' defaults 11661 1726882386.48598: variable '__network_service_name_default_initscripts' from source: role '' defaults 11661 1726882386.48649: variable '__network_packages_default_initscripts' from source: role '' defaults 11661 1726882386.48720: variable '__network_packages_default_initscripts' from source: role '' defaults 11661 1726882386.49131: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 11661 1726882386.49682: variable 'network_connections' from source: task vars 11661 1726882386.49692: variable 'controller_profile' from source: play vars 11661 1726882386.49755: variable 'controller_profile' from source: play vars 11661 1726882386.49769: variable 'controller_device' from source: play vars 11661 1726882386.49824: variable 'controller_device' from source: play vars 11661 1726882386.49842: variable 'port1_profile' from source: play vars 11661 1726882386.49903: variable 'port1_profile' from source: play vars 11661 1726882386.49914: variable 'dhcp_interface1' from source: play vars 11661 1726882386.49978: variable 'dhcp_interface1' from source: play vars 11661 1726882386.49989: variable 'controller_profile' from source: play vars 11661 1726882386.50043: variable 'controller_profile' from source: play vars 11661 1726882386.50062: variable 'port2_profile' from source: play vars 11661 1726882386.50121: variable 'port2_profile' from source: play vars 11661 1726882386.50132: variable 'dhcp_interface2' from source: play vars 11661 1726882386.50201: variable 'dhcp_interface2' from source: play vars 11661 1726882386.50215: variable 'controller_profile' from source: play vars 11661 1726882386.50284: variable 'controller_profile' from source: play vars 11661 1726882386.50299: variable 'ansible_distribution' from source: facts 11661 1726882386.50308: variable '__network_rh_distros' from source: role '' defaults 11661 1726882386.50318: 
variable 'ansible_distribution_major_version' from source: facts 11661 1726882386.50353: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 11661 1726882386.50532: variable 'ansible_distribution' from source: facts 11661 1726882386.50540: variable '__network_rh_distros' from source: role '' defaults 11661 1726882386.50549: variable 'ansible_distribution_major_version' from source: facts 11661 1726882386.50570: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 11661 1726882386.50731: variable 'ansible_distribution' from source: facts 11661 1726882386.50739: variable '__network_rh_distros' from source: role '' defaults 11661 1726882386.50749: variable 'ansible_distribution_major_version' from source: facts 11661 1726882386.50790: variable 'network_provider' from source: set_fact 11661 1726882386.50820: variable 'omit' from source: magic vars 11661 1726882386.50855: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11661 1726882386.50890: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11661 1726882386.50914: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11661 1726882386.50939: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882386.50956: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882386.50990: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11661 1726882386.50998: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882386.51005: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882386.51103: Set connection var ansible_connection to ssh 11661 
1726882386.51118: Set connection var ansible_pipelining to False 11661 1726882386.51135: Set connection var ansible_module_compression to ZIP_DEFLATED 11661 1726882386.51153: Set connection var ansible_timeout to 10 11661 1726882386.51160: Set connection var ansible_shell_type to sh 11661 1726882386.51176: Set connection var ansible_shell_executable to /bin/sh 11661 1726882386.51203: variable 'ansible_shell_executable' from source: unknown 11661 1726882386.51217: variable 'ansible_connection' from source: unknown 11661 1726882386.51237: variable 'ansible_module_compression' from source: unknown 11661 1726882386.51330: variable 'ansible_shell_type' from source: unknown 11661 1726882386.51338: variable 'ansible_shell_executable' from source: unknown 11661 1726882386.51345: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882386.51356: variable 'ansible_pipelining' from source: unknown 11661 1726882386.51363: variable 'ansible_timeout' from source: unknown 11661 1726882386.51374: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882386.51508: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11661 1726882386.51522: variable 'omit' from source: magic vars 11661 1726882386.51530: starting attempt loop 11661 1726882386.51537: running the handler 11661 1726882386.51634: variable 'ansible_facts' from source: unknown 11661 1726882386.52598: _low_level_execute_command(): starting 11661 1726882386.52609: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11661 1726882386.53414: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882386.53542: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 11661 1726882386.53565: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882386.53586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882386.53636: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882386.53679: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882386.53696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882386.53716: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882386.53729: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882386.53745: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882386.53771: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882386.53788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882386.53878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882386.53894: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882386.53908: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882386.53923: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882386.54101: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882386.54123: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882386.54139: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882386.54282: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 11661 1726882386.55960: stdout chunk (state=3): >>>/root <<< 11661 1726882386.56168: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882386.56172: stdout chunk (state=3): >>><<< 11661 1726882386.56178: stderr chunk (state=3): >>><<< 11661 1726882386.56302: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882386.56305: _low_level_execute_command(): starting 11661 1726882386.56308: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882386.5620055-12376-53894603096504 `" && echo ansible-tmp-1726882386.5620055-12376-53894603096504="` echo /root/.ansible/tmp/ansible-tmp-1726882386.5620055-12376-53894603096504 `" ) && sleep 0' 11661 1726882386.58044: stderr chunk (state=2): 
>>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882386.58065: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882386.58082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882386.58388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882386.58434: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882386.58447: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882386.58465: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882386.58511: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882386.58525: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882386.58531: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882386.58539: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882386.58548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882386.58565: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882386.58573: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882386.58580: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882386.58589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882386.58684: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882386.58702: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882386.58735: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 11661 1726882386.58948: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882386.60837: stdout chunk (state=3): >>>ansible-tmp-1726882386.5620055-12376-53894603096504=/root/.ansible/tmp/ansible-tmp-1726882386.5620055-12376-53894603096504 <<< 11661 1726882386.60948: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882386.61129: stderr chunk (state=3): >>><<< 11661 1726882386.61132: stdout chunk (state=3): >>><<< 11661 1726882386.61516: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882386.5620055-12376-53894603096504=/root/.ansible/tmp/ansible-tmp-1726882386.5620055-12376-53894603096504 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882386.61520: variable 'ansible_module_compression' from source: unknown 11661 1726882386.61523: 
ANSIBALLZ: Using generic lock for ansible.legacy.systemd 11661 1726882386.61525: ANSIBALLZ: Acquiring lock 11661 1726882386.61527: ANSIBALLZ: Lock acquired: 139652576276224 11661 1726882386.61529: ANSIBALLZ: Creating module 11661 1726882386.84713: ANSIBALLZ: Writing module into payload 11661 1726882386.84853: ANSIBALLZ: Writing module 11661 1726882386.84883: ANSIBALLZ: Renaming module 11661 1726882386.84888: ANSIBALLZ: Done creating module 11661 1726882386.84923: variable 'ansible_facts' from source: unknown 11661 1726882386.85062: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882386.5620055-12376-53894603096504/AnsiballZ_systemd.py 11661 1726882386.85181: Sending initial data 11661 1726882386.85185: Sent initial data (155 bytes) 11661 1726882386.85892: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882386.85907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882386.85929: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882386.85941: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882386.85997: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 
1726882386.86007: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882386.86133: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882386.87987: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11661 1726882386.88087: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 11661 1726882386.88186: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11661k_nrvl51/tmpi1tiwmbw /root/.ansible/tmp/ansible-tmp-1726882386.5620055-12376-53894603096504/AnsiballZ_systemd.py <<< 11661 1726882386.88284: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11661 1726882386.90380: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882386.90485: stderr chunk (state=3): >>><<< 11661 1726882386.90491: stdout chunk (state=3): >>><<< 11661 1726882386.90514: done transferring module to remote 11661 1726882386.90535: _low_level_execute_command(): starting 11661 1726882386.90540: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882386.5620055-12376-53894603096504/ /root/.ansible/tmp/ansible-tmp-1726882386.5620055-12376-53894603096504/AnsiballZ_systemd.py && sleep 0' 11661 1726882386.91210: 
stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882386.91220: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882386.91231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882386.91247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882386.91288: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882386.91302: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882386.91316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882386.91330: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882386.91338: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882386.91345: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882386.91355: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882386.91362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882386.91375: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882386.91384: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882386.91389: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882386.91401: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882386.91496: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882386.91504: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882386.91508: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882386.91636: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882386.93452: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882386.93502: stderr chunk (state=3): >>><<< 11661 1726882386.93506: stdout chunk (state=3): >>><<< 11661 1726882386.93519: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882386.93522: _low_level_execute_command(): starting 11661 1726882386.93527: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882386.5620055-12376-53894603096504/AnsiballZ_systemd.py && sleep 0' 11661 1726882386.93973: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config <<< 11661 1726882386.93979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882386.94035: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 11661 1726882386.94038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882386.94040: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 11661 1726882386.94042: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882386.94044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882386.94046: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882386.94102: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882386.94105: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882386.94112: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882386.94217: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882387.19361: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", 
"TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6692", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ExecMainStartTimestampMonotonic": "202392137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6692", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "8339456", "MemoryAvailable": "infinity", "CPUUsageNSec": "425030000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", 
"IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "<<< 11661 1726882387.19388: stdout chunk (state=3): >>>0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", 
"CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", 
"Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service multi-user.target network.target shutdown.target cloud-init.service", "After": "cloud-init-local.service dbus-broker.service network-pre.target system.slice dbus.socket systemd-journald.socket basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:57 EDT", "StateChangeTimestampMonotonic": "316658837", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveExitTimestampMonotonic": "202392395", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveEnterTimestampMonotonic": "202472383", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveExitTimestampMonotonic": "202362940", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveEnterTimestampMonotonic": "202381901", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ConditionTimestampMonotonic": "202382734", 
"AssertTimestamp": "Fri 2024-09-20 21:31:03 EDT", "AssertTimestampMonotonic": "202382737", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "55e27919215348fab37a11b7ea324f90", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 11661 1726882387.20876: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 11661 1726882387.20935: stderr chunk (state=3): >>><<< 11661 1726882387.20939: stdout chunk (state=3): >>><<< 11661 1726882387.20959: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6692", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ExecMainStartTimestampMonotonic": "202392137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6692", "ExecMainCode": "0", 
"ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "8339456", "MemoryAvailable": "infinity", "CPUUsageNSec": "425030000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", 
"MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": 
"no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service multi-user.target network.target shutdown.target cloud-init.service", "After": "cloud-init-local.service dbus-broker.service network-pre.target system.slice dbus.socket systemd-journald.socket basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": 
"/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:57 EDT", "StateChangeTimestampMonotonic": "316658837", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveExitTimestampMonotonic": "202392395", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveEnterTimestampMonotonic": "202472383", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveExitTimestampMonotonic": "202362940", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveEnterTimestampMonotonic": "202381901", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ConditionTimestampMonotonic": "202382734", "AssertTimestamp": "Fri 2024-09-20 21:31:03 EDT", "AssertTimestampMonotonic": "202382737", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "55e27919215348fab37a11b7ea324f90", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 11661 1726882387.21076: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882386.5620055-12376-53894603096504/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11661 1726882387.21092: _low_level_execute_command(): starting 11661 1726882387.21096: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r 
/root/.ansible/tmp/ansible-tmp-1726882386.5620055-12376-53894603096504/ > /dev/null 2>&1 && sleep 0' 11661 1726882387.21552: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882387.21572: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882387.21598: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 11661 1726882387.21611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882387.21621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882387.21674: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882387.21688: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882387.21796: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882387.23600: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882387.23654: stderr chunk (state=3): >>><<< 11661 1726882387.23657: stdout chunk (state=3): >>><<< 11661 1726882387.23669: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882387.23675: handler run complete 11661 1726882387.23712: attempt loop complete, returning result 11661 1726882387.23719: _execute() done 11661 1726882387.23721: dumping result to json 11661 1726882387.23733: done dumping result, returning 11661 1726882387.23741: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-896b-2321-000000000032] 11661 1726882387.23748: sending task result for task 0e448fcc-3ce9-896b-2321-000000000032 11661 1726882387.23959: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000032 11661 1726882387.23962: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11661 1726882387.24014: no more pending results, returning what we have 11661 1726882387.24017: results queue empty 
11661 1726882387.24018: checking for any_errors_fatal 11661 1726882387.24022: done checking for any_errors_fatal 11661 1726882387.24022: checking for max_fail_percentage 11661 1726882387.24024: done checking for max_fail_percentage 11661 1726882387.24025: checking to see if all hosts have failed and the running result is not ok 11661 1726882387.24025: done checking to see if all hosts have failed 11661 1726882387.24026: getting the remaining hosts for this loop 11661 1726882387.24028: done getting the remaining hosts for this loop 11661 1726882387.24031: getting the next task for host managed_node2 11661 1726882387.24037: done getting next task for host managed_node2 11661 1726882387.24041: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 11661 1726882387.24044: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11661 1726882387.24056: getting variables 11661 1726882387.24058: in VariableManager get_vars() 11661 1726882387.24096: Calling all_inventory to load vars for managed_node2 11661 1726882387.24098: Calling groups_inventory to load vars for managed_node2 11661 1726882387.24100: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882387.24109: Calling all_plugins_play to load vars for managed_node2 11661 1726882387.24111: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882387.24114: Calling groups_plugins_play to load vars for managed_node2 11661 1726882387.25013: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882387.25979: done with get_vars() 11661 1726882387.25996: done getting variables 11661 1726882387.26040: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:33:07 -0400 (0:00:00.925) 0:00:15.974 ****** 11661 1726882387.26069: entering _queue_task() for managed_node2/service 11661 1726882387.26297: worker is 1 (out of 1 available) 11661 1726882387.26310: exiting _queue_task() for managed_node2/service 11661 1726882387.26321: done queuing things up, now waiting for results queue to drain 11661 1726882387.26323: waiting for pending results... 
11661 1726882387.26504: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 11661 1726882387.26589: in run() - task 0e448fcc-3ce9-896b-2321-000000000033 11661 1726882387.26604: variable 'ansible_search_path' from source: unknown 11661 1726882387.26607: variable 'ansible_search_path' from source: unknown 11661 1726882387.26639: calling self._execute() 11661 1726882387.26713: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882387.26717: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882387.26729: variable 'omit' from source: magic vars 11661 1726882387.27006: variable 'ansible_distribution_major_version' from source: facts 11661 1726882387.27018: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882387.27102: variable 'network_provider' from source: set_fact 11661 1726882387.27106: Evaluated conditional (network_provider == "nm"): True 11661 1726882387.27173: variable '__network_wpa_supplicant_required' from source: role '' defaults 11661 1726882387.27232: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11661 1726882387.27353: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11661 1726882387.28878: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11661 1726882387.28923: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11661 1726882387.28950: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11661 1726882387.28981: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11661 1726882387.29000: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11661 1726882387.29070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11661 1726882387.29092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11661 1726882387.29111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882387.29139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11661 1726882387.29150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11661 1726882387.29185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11661 1726882387.29203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11661 1726882387.29219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882387.29246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11661 1726882387.29259: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11661 1726882387.29288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11661 1726882387.29304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11661 1726882387.29323: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882387.29352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11661 1726882387.29363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11661 1726882387.29468: variable 'network_connections' from source: task vars 11661 1726882387.29478: variable 'controller_profile' from source: play vars 11661 1726882387.29528: variable 'controller_profile' from source: play vars 11661 1726882387.29536: variable 'controller_device' from source: play vars 11661 1726882387.29581: variable 'controller_device' from source: play vars 11661 1726882387.29590: variable 'port1_profile' from source: play vars 11661 1726882387.29630: variable 'port1_profile' from source: play vars 11661 
1726882387.29641: variable 'dhcp_interface1' from source: play vars 11661 1726882387.29691: variable 'dhcp_interface1' from source: play vars 11661 1726882387.29698: variable 'controller_profile' from source: play vars 11661 1726882387.29740: variable 'controller_profile' from source: play vars 11661 1726882387.29752: variable 'port2_profile' from source: play vars 11661 1726882387.29795: variable 'port2_profile' from source: play vars 11661 1726882387.29801: variable 'dhcp_interface2' from source: play vars 11661 1726882387.29845: variable 'dhcp_interface2' from source: play vars 11661 1726882387.29855: variable 'controller_profile' from source: play vars 11661 1726882387.29900: variable 'controller_profile' from source: play vars 11661 1726882387.29952: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11661 1726882387.30074: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11661 1726882387.30101: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11661 1726882387.30123: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11661 1726882387.30145: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11661 1726882387.30180: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11661 1726882387.30195: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11661 1726882387.30214: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 
(found_in_cache=True, class_only=False) 11661 1726882387.30238: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11661 1726882387.30280: variable '__network_wireless_connections_defined' from source: role '' defaults 11661 1726882387.30450: variable 'network_connections' from source: task vars 11661 1726882387.30458: variable 'controller_profile' from source: play vars 11661 1726882387.30503: variable 'controller_profile' from source: play vars 11661 1726882387.30509: variable 'controller_device' from source: play vars 11661 1726882387.30551: variable 'controller_device' from source: play vars 11661 1726882387.30561: variable 'port1_profile' from source: play vars 11661 1726882387.30606: variable 'port1_profile' from source: play vars 11661 1726882387.30609: variable 'dhcp_interface1' from source: play vars 11661 1726882387.30652: variable 'dhcp_interface1' from source: play vars 11661 1726882387.30662: variable 'controller_profile' from source: play vars 11661 1726882387.30709: variable 'controller_profile' from source: play vars 11661 1726882387.30719: variable 'port2_profile' from source: play vars 11661 1726882387.30766: variable 'port2_profile' from source: play vars 11661 1726882387.30772: variable 'dhcp_interface2' from source: play vars 11661 1726882387.30814: variable 'dhcp_interface2' from source: play vars 11661 1726882387.30824: variable 'controller_profile' from source: play vars 11661 1726882387.30869: variable 'controller_profile' from source: play vars 11661 1726882387.30899: Evaluated conditional (__network_wpa_supplicant_required): False 11661 1726882387.30902: when evaluation is False, skipping this task 11661 1726882387.30905: _execute() done 11661 1726882387.30907: dumping result to json 11661 1726882387.30911: done dumping result, returning 11661 1726882387.30918: done running TaskExecutor() for 
managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0e448fcc-3ce9-896b-2321-000000000033] 11661 1726882387.30923: sending task result for task 0e448fcc-3ce9-896b-2321-000000000033 11661 1726882387.31014: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000033 11661 1726882387.31017: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 11661 1726882387.31068: no more pending results, returning what we have 11661 1726882387.31072: results queue empty 11661 1726882387.31073: checking for any_errors_fatal 11661 1726882387.31097: done checking for any_errors_fatal 11661 1726882387.31098: checking for max_fail_percentage 11661 1726882387.31100: done checking for max_fail_percentage 11661 1726882387.31101: checking to see if all hosts have failed and the running result is not ok 11661 1726882387.31101: done checking to see if all hosts have failed 11661 1726882387.31102: getting the remaining hosts for this loop 11661 1726882387.31104: done getting the remaining hosts for this loop 11661 1726882387.31107: getting the next task for host managed_node2 11661 1726882387.31114: done getting next task for host managed_node2 11661 1726882387.31119: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 11661 1726882387.31122: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 11661 1726882387.31135: getting variables 11661 1726882387.31137: in VariableManager get_vars() 11661 1726882387.31185: Calling all_inventory to load vars for managed_node2 11661 1726882387.31189: Calling groups_inventory to load vars for managed_node2 11661 1726882387.31191: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882387.31199: Calling all_plugins_play to load vars for managed_node2 11661 1726882387.31202: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882387.31204: Calling groups_plugins_play to load vars for managed_node2 11661 1726882387.32016: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882387.32963: done with get_vars() 11661 1726882387.32982: done getting variables 11661 1726882387.33027: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:33:07 -0400 (0:00:00.069) 0:00:16.043 ****** 11661 1726882387.33052: entering _queue_task() for managed_node2/service 11661 1726882387.33284: worker is 1 (out of 1 available) 11661 1726882387.33298: exiting _queue_task() for managed_node2/service 11661 1726882387.33309: done queuing things up, now waiting for results queue to drain 11661 1726882387.33311: waiting for pending results... 
11661 1726882387.33488: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 11661 1726882387.33575: in run() - task 0e448fcc-3ce9-896b-2321-000000000034 11661 1726882387.33587: variable 'ansible_search_path' from source: unknown 11661 1726882387.33590: variable 'ansible_search_path' from source: unknown 11661 1726882387.33620: calling self._execute() 11661 1726882387.33694: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882387.33698: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882387.33707: variable 'omit' from source: magic vars 11661 1726882387.33977: variable 'ansible_distribution_major_version' from source: facts 11661 1726882387.33987: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882387.34066: variable 'network_provider' from source: set_fact 11661 1726882387.34071: Evaluated conditional (network_provider == "initscripts"): False 11661 1726882387.34079: when evaluation is False, skipping this task 11661 1726882387.34082: _execute() done 11661 1726882387.34084: dumping result to json 11661 1726882387.34087: done dumping result, returning 11661 1726882387.34094: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-896b-2321-000000000034] 11661 1726882387.34099: sending task result for task 0e448fcc-3ce9-896b-2321-000000000034 11661 1726882387.34187: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000034 11661 1726882387.34190: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11661 1726882387.34238: no more pending results, returning what we have 11661 1726882387.34242: results queue empty 11661 1726882387.34242: checking for any_errors_fatal 11661 1726882387.34252: done checking for 
any_errors_fatal 11661 1726882387.34253: checking for max_fail_percentage 11661 1726882387.34255: done checking for max_fail_percentage 11661 1726882387.34256: checking to see if all hosts have failed and the running result is not ok 11661 1726882387.34257: done checking to see if all hosts have failed 11661 1726882387.34257: getting the remaining hosts for this loop 11661 1726882387.34260: done getting the remaining hosts for this loop 11661 1726882387.34263: getting the next task for host managed_node2 11661 1726882387.34271: done getting next task for host managed_node2 11661 1726882387.34275: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 11661 1726882387.34278: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11661 1726882387.34291: getting variables 11661 1726882387.34293: in VariableManager get_vars() 11661 1726882387.34333: Calling all_inventory to load vars for managed_node2 11661 1726882387.34336: Calling groups_inventory to load vars for managed_node2 11661 1726882387.34338: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882387.34346: Calling all_plugins_play to load vars for managed_node2 11661 1726882387.34348: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882387.34354: Calling groups_plugins_play to load vars for managed_node2 11661 1726882387.35252: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882387.36190: done with get_vars() 11661 1726882387.36204: done getting variables 11661 1726882387.36247: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:33:07 -0400 (0:00:00.032) 0:00:16.076 ****** 11661 1726882387.36276: entering _queue_task() for managed_node2/copy 11661 1726882387.36495: worker is 1 (out of 1 available) 11661 1726882387.36510: exiting _queue_task() for managed_node2/copy 11661 1726882387.36521: done queuing things up, now waiting for results queue to drain 11661 1726882387.36523: waiting for pending results... 
11661 1726882387.36701: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 11661 1726882387.36783: in run() - task 0e448fcc-3ce9-896b-2321-000000000035 11661 1726882387.36793: variable 'ansible_search_path' from source: unknown 11661 1726882387.36797: variable 'ansible_search_path' from source: unknown 11661 1726882387.36828: calling self._execute() 11661 1726882387.36898: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882387.36901: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882387.36910: variable 'omit' from source: magic vars 11661 1726882387.37173: variable 'ansible_distribution_major_version' from source: facts 11661 1726882387.37184: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882387.37258: variable 'network_provider' from source: set_fact 11661 1726882387.37270: Evaluated conditional (network_provider == "initscripts"): False 11661 1726882387.37274: when evaluation is False, skipping this task 11661 1726882387.37276: _execute() done 11661 1726882387.37279: dumping result to json 11661 1726882387.37282: done dumping result, returning 11661 1726882387.37290: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-896b-2321-000000000035] 11661 1726882387.37295: sending task result for task 0e448fcc-3ce9-896b-2321-000000000035 11661 1726882387.37380: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000035 11661 1726882387.37383: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 11661 1726882387.37435: no more pending results, returning what we have 11661 1726882387.37438: results queue empty 11661 1726882387.37439: checking for 
any_errors_fatal 11661 1726882387.37445: done checking for any_errors_fatal 11661 1726882387.37446: checking for max_fail_percentage 11661 1726882387.37448: done checking for max_fail_percentage 11661 1726882387.37449: checking to see if all hosts have failed and the running result is not ok 11661 1726882387.37452: done checking to see if all hosts have failed 11661 1726882387.37453: getting the remaining hosts for this loop 11661 1726882387.37454: done getting the remaining hosts for this loop 11661 1726882387.37457: getting the next task for host managed_node2 11661 1726882387.37467: done getting next task for host managed_node2 11661 1726882387.37471: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 11661 1726882387.37473: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11661 1726882387.37486: getting variables 11661 1726882387.37487: in VariableManager get_vars() 11661 1726882387.37526: Calling all_inventory to load vars for managed_node2 11661 1726882387.37529: Calling groups_inventory to load vars for managed_node2 11661 1726882387.37531: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882387.37539: Calling all_plugins_play to load vars for managed_node2 11661 1726882387.37541: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882387.37543: Calling groups_plugins_play to load vars for managed_node2 11661 1726882387.38324: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882387.39374: done with get_vars() 11661 1726882387.39389: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:33:07 -0400 (0:00:00.031) 0:00:16.107 ****** 11661 1726882387.39453: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 11661 1726882387.39454: Creating lock for fedora.linux_system_roles.network_connections 11661 1726882387.39683: worker is 1 (out of 1 available) 11661 1726882387.39696: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 11661 1726882387.39707: done queuing things up, now waiting for results queue to drain 11661 1726882387.39709: waiting for pending results... 
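The two tasks skipped above ("Enable network service" and "Ensure initscripts network file dependency is present") were gated by the conditionals the log shows being evaluated: `ansible_distribution_major_version != '6'` came back True, but `network_provider == "initscripts"` came back False, so both tasks were skipped. A hedged sketch of what such a guard looks like in the role (the task names and conditions come from the log; the `copy` parameters are hypothetical placeholders, not taken from the role source):

```yaml
# Reconstruction of the guard evaluated in the log above.
# Log evidence:
#   Evaluated conditional (ansible_distribution_major_version != '6'): True
#   Evaluated conditional (network_provider == "initscripts"): False  -> skipped
- name: Ensure initscripts network file dependency is present
  ansible.builtin.copy:
    src: example-network-dependency.conf   # hypothetical src/dest, for illustration only
    dest: /etc/example-network-dependency.conf
  when:
    - ansible_distribution_major_version != '6'
    - network_provider == "initscripts"
```

When any item in the `when` list evaluates False, Ansible emits exactly the `skipping: [host] => {..., "false_condition": ...}` result seen above rather than executing the action.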
11661 1726882387.39878: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 11661 1726882387.39965: in run() - task 0e448fcc-3ce9-896b-2321-000000000036 11661 1726882387.39977: variable 'ansible_search_path' from source: unknown 11661 1726882387.39980: variable 'ansible_search_path' from source: unknown 11661 1726882387.40010: calling self._execute() 11661 1726882387.40075: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882387.40079: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882387.40088: variable 'omit' from source: magic vars 11661 1726882387.40349: variable 'ansible_distribution_major_version' from source: facts 11661 1726882387.40360: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882387.40364: variable 'omit' from source: magic vars 11661 1726882387.40400: variable 'omit' from source: magic vars 11661 1726882387.40508: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11661 1726882387.41980: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11661 1726882387.42024: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11661 1726882387.42052: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11661 1726882387.42080: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11661 1726882387.42101: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11661 1726882387.42155: variable 'network_provider' from source: set_fact 11661 1726882387.42243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11661 1726882387.42277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11661 1726882387.42296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882387.42322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11661 1726882387.42334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11661 1726882387.42387: variable 'omit' from source: magic vars 11661 1726882387.42464: variable 'omit' from source: magic vars 11661 1726882387.42533: variable 'network_connections' from source: task vars 11661 1726882387.42542: variable 'controller_profile' from source: play vars 11661 1726882387.42587: variable 'controller_profile' from source: play vars 11661 1726882387.42593: variable 'controller_device' from source: play vars 11661 1726882387.42637: variable 'controller_device' from source: play vars 11661 1726882387.42645: variable 'port1_profile' from source: play vars 11661 1726882387.42692: variable 'port1_profile' from source: play vars 11661 1726882387.42698: variable 'dhcp_interface1' from source: play vars 11661 1726882387.42742: variable 'dhcp_interface1' from source: play vars 11661 1726882387.42747: variable 'controller_profile' from source: play vars 11661 1726882387.42793: variable 'controller_profile' from source: play vars 11661 1726882387.42799: 
variable 'port2_profile' from source: play vars 11661 1726882387.42841: variable 'port2_profile' from source: play vars 11661 1726882387.42848: variable 'dhcp_interface2' from source: play vars 11661 1726882387.42892: variable 'dhcp_interface2' from source: play vars 11661 1726882387.42897: variable 'controller_profile' from source: play vars 11661 1726882387.42938: variable 'controller_profile' from source: play vars 11661 1726882387.43077: variable 'omit' from source: magic vars 11661 1726882387.43085: variable '__lsr_ansible_managed' from source: task vars 11661 1726882387.43126: variable '__lsr_ansible_managed' from source: task vars 11661 1726882387.43248: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 11661 1726882387.43394: Loaded config def from plugin (lookup/template) 11661 1726882387.43397: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 11661 1726882387.43419: File lookup term: get_ansible_managed.j2 11661 1726882387.43422: variable 'ansible_search_path' from source: unknown 11661 1726882387.43425: evaluation_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 11661 1726882387.43436: search_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 11661 1726882387.43448: variable 'ansible_search_path' from source: unknown 11661 1726882387.46964: variable 'ansible_managed' from source: unknown 11661 1726882387.47050: variable 'omit' from source: magic vars 11661 1726882387.47077: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11661 1726882387.47097: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11661 1726882387.47111: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11661 1726882387.47123: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882387.47131: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882387.47155: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11661 1726882387.47158: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882387.47161: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882387.47225: Set connection var ansible_connection to ssh 11661 1726882387.47229: Set connection var ansible_pipelining to False 11661 1726882387.47235: Set connection var ansible_module_compression to ZIP_DEFLATED 11661 1726882387.47241: Set connection var ansible_timeout to 10 11661 1726882387.47244: Set connection var ansible_shell_type to sh 11661 1726882387.47259: Set connection var ansible_shell_executable to /bin/sh 11661 1726882387.47272: 
variable 'ansible_shell_executable' from source: unknown 11661 1726882387.47280: variable 'ansible_connection' from source: unknown 11661 1726882387.47282: variable 'ansible_module_compression' from source: unknown 11661 1726882387.47284: variable 'ansible_shell_type' from source: unknown 11661 1726882387.47287: variable 'ansible_shell_executable' from source: unknown 11661 1726882387.47289: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882387.47292: variable 'ansible_pipelining' from source: unknown 11661 1726882387.47295: variable 'ansible_timeout' from source: unknown 11661 1726882387.47298: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882387.47388: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11661 1726882387.47392: variable 'omit' from source: magic vars 11661 1726882387.47399: starting attempt loop 11661 1726882387.47401: running the handler 11661 1726882387.47412: _low_level_execute_command(): starting 11661 1726882387.47419: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11661 1726882387.47922: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882387.47941: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882387.47959: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882387.47975: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882387.48020: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882387.48032: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882387.48151: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882387.49843: stdout chunk (state=3): >>>/root <<< 11661 1726882387.49940: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882387.49999: stderr chunk (state=3): >>><<< 11661 1726882387.50006: stdout chunk (state=3): >>><<< 11661 1726882387.50023: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882387.50033: _low_level_execute_command(): starting 11661 1726882387.50039: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882387.5002337-12408-130850815313422 `" && echo ansible-tmp-1726882387.5002337-12408-130850815313422="` echo /root/.ansible/tmp/ansible-tmp-1726882387.5002337-12408-130850815313422 `" ) && sleep 0' 11661 1726882387.50493: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882387.50505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882387.50522: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882387.50534: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882387.50584: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882387.50605: stderr 
chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882387.50705: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882387.52658: stdout chunk (state=3): >>>ansible-tmp-1726882387.5002337-12408-130850815313422=/root/.ansible/tmp/ansible-tmp-1726882387.5002337-12408-130850815313422 <<< 11661 1726882387.52767: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882387.52820: stderr chunk (state=3): >>><<< 11661 1726882387.52823: stdout chunk (state=3): >>><<< 11661 1726882387.52839: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882387.5002337-12408-130850815313422=/root/.ansible/tmp/ansible-tmp-1726882387.5002337-12408-130850815313422 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882387.52884: variable 
'ansible_module_compression' from source: unknown 11661 1726882387.52926: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 11661 1726882387.52929: ANSIBALLZ: Acquiring lock 11661 1726882387.52931: ANSIBALLZ: Lock acquired: 139652580037008 11661 1726882387.52934: ANSIBALLZ: Creating module 11661 1726882387.66344: ANSIBALLZ: Writing module into payload 11661 1726882387.66688: ANSIBALLZ: Writing module 11661 1726882387.66712: ANSIBALLZ: Renaming module 11661 1726882387.66718: ANSIBALLZ: Done creating module 11661 1726882387.66739: variable 'ansible_facts' from source: unknown 11661 1726882387.66809: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882387.5002337-12408-130850815313422/AnsiballZ_network_connections.py 11661 1726882387.66920: Sending initial data 11661 1726882387.66929: Sent initial data (168 bytes) 11661 1726882387.67632: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882387.67645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882387.67661: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882387.67674: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 
1726882387.67723: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882387.67744: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882387.67860: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882387.69689: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 11661 1726882387.69694: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11661 1726882387.69785: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 11661 1726882387.69885: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11661k_nrvl51/tmpog7fukie /root/.ansible/tmp/ansible-tmp-1726882387.5002337-12408-130850815313422/AnsiballZ_network_connections.py <<< 11661 1726882387.69989: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11661 1726882387.71508: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882387.71634: stderr chunk (state=3): >>><<< 11661 1726882387.71637: stdout chunk (state=3): >>><<< 11661 1726882387.71660: done transferring module to remote 11661 1726882387.71676: _low_level_execute_command(): starting 11661 1726882387.71681: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1726882387.5002337-12408-130850815313422/ /root/.ansible/tmp/ansible-tmp-1726882387.5002337-12408-130850815313422/AnsiballZ_network_connections.py && sleep 0' 11661 1726882387.72128: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882387.72134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882387.72180: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882387.72183: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882387.72186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882387.72242: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882387.72245: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882387.72347: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882387.74186: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882387.74225: stderr chunk (state=3): >>><<< 11661 1726882387.74228: stdout chunk (state=3): >>><<< 11661 1726882387.74245: _low_level_execute_command() done: rc=0, 
stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882387.74248: _low_level_execute_command(): starting 11661 1726882387.74254: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882387.5002337-12408-130850815313422/AnsiballZ_network_connections.py && sleep 0' 11661 1726882387.74902: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882387.74910: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882387.74921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882387.74936: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882387.74981: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 
10.31.11.158 <<< 11661 1726882387.74988: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882387.74998: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882387.75012: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882387.75019: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882387.75026: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882387.75033: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882387.75043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882387.75055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882387.75068: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882387.75075: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882387.75085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882387.75171: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882387.75176: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882387.75179: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882387.75327: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882388.16420: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 6479a90a-8c97-4625-ab42-bedea857a927\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 6d0f0ab8-7234-4420-a06e-101a4b2a8221\n[009] #2, state:up 
persistent_state:present, 'bond0.1': add connection bond0.1, 1c852c52-21d3-4d36-862f-c83a68bc1805\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 6479a90a-8c97-4625-ab42-bedea857a927 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 6d0f0ab8-7234-4420-a06e-101a4b2a8221 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 1c852c52-21d3-4d36-862f-c83a68bc1805 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 11661 1726882388.18756: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 11661 1726882388.18809: stderr chunk (state=3): >>><<< 11661 1726882388.18813: stdout chunk (state=3): >>><<< 11661 1726882388.18828: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 6479a90a-8c97-4625-ab42-bedea857a927\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 6d0f0ab8-7234-4420-a06e-101a4b2a8221\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 1c852c52-21d3-4d36-862f-c83a68bc1805\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 6479a90a-8c97-4625-ab42-bedea857a927 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 6d0f0ab8-7234-4420-a06e-101a4b2a8221 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 1c852c52-21d3-4d36-862f-c83a68bc1805 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], 
"__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
11661 1726882388.18875: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0', 'state': 'up', 'type': 'bond', 'interface_name': 'nm-bond', 'bond': {'mode': 'active-backup', 'miimon': 110}, 'ip': {'route_metric4': 65535}}, {'name': 'bond0.0', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test1', 'controller': 'bond0'}, {'name': 'bond0.1', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test2', 'controller': 'bond0'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882387.5002337-12408-130850815313422/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11661 1726882388.18883: _low_level_execute_command(): starting 11661 1726882388.18889: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882387.5002337-12408-130850815313422/ > /dev/null 2>&1 && sleep 0' 11661 1726882388.19336: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882388.19340: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882388.19390: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 
debug2: match not found <<< 11661 1726882388.19393: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882388.19396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882388.19456: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882388.19461: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882388.19468: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882388.19575: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882388.22007: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882388.22058: stderr chunk (state=3): >>><<< 11661 1726882388.22062: stdout chunk (state=3): >>><<< 11661 1726882388.22083: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882388.22089: handler run complete 11661 1726882388.22115: attempt loop complete, returning result 11661 1726882388.22120: _execute() done 11661 1726882388.22122: dumping result to json 11661 1726882388.22128: done dumping result, returning 11661 1726882388.22136: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0e448fcc-3ce9-896b-2321-000000000036] 11661 1726882388.22141: sending task result for task 0e448fcc-3ce9-896b-2321-000000000036 11661 1726882388.22253: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000036 11661 1726882388.22256: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "miimon": 110, "mode": "active-backup" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 6479a90a-8c97-4625-ab42-bedea857a927 
[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 6d0f0ab8-7234-4420-a06e-101a4b2a8221 [009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 1c852c52-21d3-4d36-862f-c83a68bc1805 [010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 6479a90a-8c97-4625-ab42-bedea857a927 (is-modified) [011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 6d0f0ab8-7234-4420-a06e-101a4b2a8221 (not-active) [012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 1c852c52-21d3-4d36-862f-c83a68bc1805 (not-active) 11661 1726882388.22382: no more pending results, returning what we have 11661 1726882388.22385: results queue empty 11661 1726882388.22386: checking for any_errors_fatal 11661 1726882388.22392: done checking for any_errors_fatal 11661 1726882388.22393: checking for max_fail_percentage 11661 1726882388.22394: done checking for max_fail_percentage 11661 1726882388.22395: checking to see if all hosts have failed and the running result is not ok 11661 1726882388.22396: done checking to see if all hosts have failed 11661 1726882388.22397: getting the remaining hosts for this loop 11661 1726882388.22398: done getting the remaining hosts for this loop 11661 1726882388.22402: getting the next task for host managed_node2 11661 1726882388.22409: done getting next task for host managed_node2 11661 1726882388.22412: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 11661 1726882388.22415: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11661 1726882388.22423: getting variables 11661 1726882388.22425: in VariableManager get_vars() 11661 1726882388.22472: Calling all_inventory to load vars for managed_node2 11661 1726882388.22475: Calling groups_inventory to load vars for managed_node2 11661 1726882388.22476: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882388.22485: Calling all_plugins_play to load vars for managed_node2 11661 1726882388.22487: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882388.22490: Calling groups_plugins_play to load vars for managed_node2 11661 1726882388.23319: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882388.24782: done with get_vars() 11661 1726882388.24813: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:33:08 -0400 (0:00:00.854) 0:00:16.962 ****** 11661 1726882388.24907: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 11661 1726882388.24909: Creating lock for fedora.linux_system_roles.network_state 11661 1726882388.25236: worker is 1 (out of 1 available) 11661 1726882388.25250: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 11661 1726882388.25266: done queuing things up, now waiting for results queue to drain 11661 1726882388.25268: waiting for pending results... 
11661 1726882388.25552: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 11661 1726882388.25701: in run() - task 0e448fcc-3ce9-896b-2321-000000000037 11661 1726882388.25726: variable 'ansible_search_path' from source: unknown 11661 1726882388.25734: variable 'ansible_search_path' from source: unknown 11661 1726882388.25779: calling self._execute() 11661 1726882388.25876: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882388.25887: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882388.25908: variable 'omit' from source: magic vars 11661 1726882388.26229: variable 'ansible_distribution_major_version' from source: facts 11661 1726882388.26239: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882388.26334: variable 'network_state' from source: role '' defaults 11661 1726882388.26342: Evaluated conditional (network_state != {}): False 11661 1726882388.26345: when evaluation is False, skipping this task 11661 1726882388.26348: _execute() done 11661 1726882388.26353: dumping result to json 11661 1726882388.26356: done dumping result, returning 11661 1726882388.26361: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-896b-2321-000000000037] 11661 1726882388.26368: sending task result for task 0e448fcc-3ce9-896b-2321-000000000037 11661 1726882388.26460: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000037 11661 1726882388.26465: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11661 1726882388.26517: no more pending results, returning what we have 11661 1726882388.26520: results queue empty 11661 1726882388.26521: checking for any_errors_fatal 11661 1726882388.26531: done checking for any_errors_fatal 
11661 1726882388.26532: checking for max_fail_percentage 11661 1726882388.26534: done checking for max_fail_percentage 11661 1726882388.26534: checking to see if all hosts have failed and the running result is not ok 11661 1726882388.26535: done checking to see if all hosts have failed 11661 1726882388.26536: getting the remaining hosts for this loop 11661 1726882388.26537: done getting the remaining hosts for this loop 11661 1726882388.26540: getting the next task for host managed_node2 11661 1726882388.26547: done getting next task for host managed_node2 11661 1726882388.26550: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 11661 1726882388.26553: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11661 1726882388.26571: getting variables 11661 1726882388.26572: in VariableManager get_vars() 11661 1726882388.26611: Calling all_inventory to load vars for managed_node2 11661 1726882388.26614: Calling groups_inventory to load vars for managed_node2 11661 1726882388.26617: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882388.26626: Calling all_plugins_play to load vars for managed_node2 11661 1726882388.26628: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882388.26630: Calling groups_plugins_play to load vars for managed_node2 11661 1726882388.28206: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882388.29131: done with get_vars() 11661 1726882388.29147: done getting variables 11661 1726882388.29193: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:33:08 -0400 (0:00:00.043) 0:00:17.005 ****** 11661 1726882388.29215: entering _queue_task() for managed_node2/debug 11661 1726882388.29424: worker is 1 (out of 1 available) 11661 1726882388.29437: exiting _queue_task() for managed_node2/debug 11661 1726882388.29450: done queuing things up, now waiting for results queue to drain 11661 1726882388.29451: waiting for pending results... 
11661 1726882388.29626: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 11661 1726882388.29713: in run() - task 0e448fcc-3ce9-896b-2321-000000000038 11661 1726882388.29739: variable 'ansible_search_path' from source: unknown 11661 1726882388.29749: variable 'ansible_search_path' from source: unknown 11661 1726882388.29793: calling self._execute() 11661 1726882388.29891: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882388.29905: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882388.29925: variable 'omit' from source: magic vars 11661 1726882388.30331: variable 'ansible_distribution_major_version' from source: facts 11661 1726882388.30357: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882388.30371: variable 'omit' from source: magic vars 11661 1726882388.30433: variable 'omit' from source: magic vars 11661 1726882388.30481: variable 'omit' from source: magic vars 11661 1726882388.30529: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11661 1726882388.30575: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11661 1726882388.30605: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11661 1726882388.30629: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882388.30646: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882388.30687: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11661 1726882388.30696: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882388.30709: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 11661 1726882388.30820: Set connection var ansible_connection to ssh 11661 1726882388.30833: Set connection var ansible_pipelining to False 11661 1726882388.30843: Set connection var ansible_module_compression to ZIP_DEFLATED 11661 1726882388.30854: Set connection var ansible_timeout to 10 11661 1726882388.30859: Set connection var ansible_shell_type to sh 11661 1726882388.30874: Set connection var ansible_shell_executable to /bin/sh 11661 1726882388.30903: variable 'ansible_shell_executable' from source: unknown 11661 1726882388.30911: variable 'ansible_connection' from source: unknown 11661 1726882388.30924: variable 'ansible_module_compression' from source: unknown 11661 1726882388.30932: variable 'ansible_shell_type' from source: unknown 11661 1726882388.30940: variable 'ansible_shell_executable' from source: unknown 11661 1726882388.30947: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882388.30955: variable 'ansible_pipelining' from source: unknown 11661 1726882388.30962: variable 'ansible_timeout' from source: unknown 11661 1726882388.30975: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882388.31102: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11661 1726882388.31110: variable 'omit' from source: magic vars 11661 1726882388.31115: starting attempt loop 11661 1726882388.31118: running the handler 11661 1726882388.31213: variable '__network_connections_result' from source: set_fact 11661 1726882388.31261: handler run complete 11661 1726882388.31278: attempt loop complete, returning result 11661 1726882388.31289: _execute() done 11661 1726882388.31292: dumping result to json 11661 1726882388.31295: 
done dumping result, returning 11661 1726882388.31298: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-896b-2321-000000000038] 11661 1726882388.31304: sending task result for task 0e448fcc-3ce9-896b-2321-000000000038 11661 1726882388.31385: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000038 11661 1726882388.31387: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 6479a90a-8c97-4625-ab42-bedea857a927", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 6d0f0ab8-7234-4420-a06e-101a4b2a8221", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 1c852c52-21d3-4d36-862f-c83a68bc1805", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 6479a90a-8c97-4625-ab42-bedea857a927 (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 6d0f0ab8-7234-4420-a06e-101a4b2a8221 (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 1c852c52-21d3-4d36-862f-c83a68bc1805 (not-active)" ] } 11661 1726882388.31447: no more pending results, returning what we have 11661 1726882388.31450: results queue empty 11661 1726882388.31451: checking for any_errors_fatal 11661 1726882388.31457: done checking for any_errors_fatal 11661 1726882388.31458: checking for max_fail_percentage 11661 1726882388.31460: done checking for max_fail_percentage 11661 1726882388.31461: checking to see if all hosts have failed and the running result is not ok 11661 1726882388.31461: done checking to see if all hosts have failed 11661 1726882388.31462: getting the remaining hosts for this loop 11661 1726882388.31465: done getting the remaining hosts for this loop 11661 1726882388.31469: getting the next task for host 
managed_node2 11661 1726882388.31475: done getting next task for host managed_node2 11661 1726882388.31479: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 11661 1726882388.31482: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11661 1726882388.31492: getting variables 11661 1726882388.31493: in VariableManager get_vars() 11661 1726882388.31530: Calling all_inventory to load vars for managed_node2 11661 1726882388.31533: Calling groups_inventory to load vars for managed_node2 11661 1726882388.31535: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882388.31542: Calling all_plugins_play to load vars for managed_node2 11661 1726882388.31545: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882388.31547: Calling groups_plugins_play to load vars for managed_node2 11661 1726882388.32341: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882388.33356: done with get_vars() 11661 1726882388.33372: done getting variables 11661 1726882388.33413: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show 
debug messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:33:08 -0400 (0:00:00.042) 0:00:17.047 ****** 11661 1726882388.33440: entering _queue_task() for managed_node2/debug 11661 1726882388.33626: worker is 1 (out of 1 available) 11661 1726882388.33641: exiting _queue_task() for managed_node2/debug 11661 1726882388.33653: done queuing things up, now waiting for results queue to drain 11661 1726882388.33655: waiting for pending results... 11661 1726882388.33830: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 11661 1726882388.33910: in run() - task 0e448fcc-3ce9-896b-2321-000000000039 11661 1726882388.33922: variable 'ansible_search_path' from source: unknown 11661 1726882388.33925: variable 'ansible_search_path' from source: unknown 11661 1726882388.33957: calling self._execute() 11661 1726882388.34022: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882388.34026: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882388.34035: variable 'omit' from source: magic vars 11661 1726882388.34309: variable 'ansible_distribution_major_version' from source: facts 11661 1726882388.34318: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882388.34324: variable 'omit' from source: magic vars 11661 1726882388.34367: variable 'omit' from source: magic vars 11661 1726882388.34393: variable 'omit' from source: magic vars 11661 1726882388.34426: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11661 1726882388.34451: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11661 1726882388.34471: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 
11661 1726882388.34485: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882388.34496: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882388.34518: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11661 1726882388.34521: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882388.34523: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882388.34592: Set connection var ansible_connection to ssh 11661 1726882388.34598: Set connection var ansible_pipelining to False 11661 1726882388.34602: Set connection var ansible_module_compression to ZIP_DEFLATED 11661 1726882388.34612: Set connection var ansible_timeout to 10 11661 1726882388.34615: Set connection var ansible_shell_type to sh 11661 1726882388.34621: Set connection var ansible_shell_executable to /bin/sh 11661 1726882388.34637: variable 'ansible_shell_executable' from source: unknown 11661 1726882388.34640: variable 'ansible_connection' from source: unknown 11661 1726882388.34642: variable 'ansible_module_compression' from source: unknown 11661 1726882388.34645: variable 'ansible_shell_type' from source: unknown 11661 1726882388.34647: variable 'ansible_shell_executable' from source: unknown 11661 1726882388.34649: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882388.34656: variable 'ansible_pipelining' from source: unknown 11661 1726882388.34659: variable 'ansible_timeout' from source: unknown 11661 1726882388.34665: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882388.34768: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11661 1726882388.34779: variable 'omit' from source: magic vars 11661 1726882388.34784: starting attempt loop 11661 1726882388.34787: running the handler 11661 1726882388.34826: variable '__network_connections_result' from source: set_fact 11661 1726882388.34882: variable '__network_connections_result' from source: set_fact 11661 1726882388.34991: handler run complete 11661 1726882388.35009: attempt loop complete, returning result 11661 1726882388.35012: _execute() done 11661 1726882388.35015: dumping result to json 11661 1726882388.35020: done dumping result, returning 11661 1726882388.35027: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-896b-2321-000000000039] 11661 1726882388.35032: sending task result for task 0e448fcc-3ce9-896b-2321-000000000039 11661 1726882388.35121: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000039 11661 1726882388.35125: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "miimon": 110, "mode": "active-backup" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 
6479a90a-8c97-4625-ab42-bedea857a927\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 6d0f0ab8-7234-4420-a06e-101a4b2a8221\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 1c852c52-21d3-4d36-862f-c83a68bc1805\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 6479a90a-8c97-4625-ab42-bedea857a927 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 6d0f0ab8-7234-4420-a06e-101a4b2a8221 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 1c852c52-21d3-4d36-862f-c83a68bc1805 (not-active)\n", "stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 6479a90a-8c97-4625-ab42-bedea857a927", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 6d0f0ab8-7234-4420-a06e-101a4b2a8221", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 1c852c52-21d3-4d36-862f-c83a68bc1805", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 6479a90a-8c97-4625-ab42-bedea857a927 (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 6d0f0ab8-7234-4420-a06e-101a4b2a8221 (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 1c852c52-21d3-4d36-862f-c83a68bc1805 (not-active)" ] } } 11661 1726882388.35230: no more pending results, returning what we have 11661 1726882388.35233: results queue empty 11661 1726882388.35238: checking for any_errors_fatal 11661 1726882388.35242: done checking for any_errors_fatal 11661 1726882388.35243: checking for max_fail_percentage 11661 1726882388.35245: done checking for max_fail_percentage 11661 1726882388.35245: checking to see if all hosts have failed and the running result is not ok 11661 1726882388.35246: done checking to see if all hosts have failed 11661 1726882388.35247: getting the remaining 
hosts for this loop 11661 1726882388.35248: done getting the remaining hosts for this loop 11661 1726882388.35259: getting the next task for host managed_node2 11661 1726882388.35266: done getting next task for host managed_node2 11661 1726882388.35269: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 11661 1726882388.35271: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11661 1726882388.35279: getting variables 11661 1726882388.35280: in VariableManager get_vars() 11661 1726882388.35305: Calling all_inventory to load vars for managed_node2 11661 1726882388.35306: Calling groups_inventory to load vars for managed_node2 11661 1726882388.35308: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882388.35313: Calling all_plugins_play to load vars for managed_node2 11661 1726882388.35315: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882388.35317: Calling groups_plugins_play to load vars for managed_node2 11661 1726882388.36102: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882388.37037: done with get_vars() 11661 1726882388.37054: done getting variables 11661 1726882388.37098: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:33:08 -0400 (0:00:00.036) 0:00:17.084 ****** 11661 1726882388.37123: entering _queue_task() for managed_node2/debug 11661 1726882388.37318: worker is 1 (out of 1 available) 11661 1726882388.37331: exiting _queue_task() for managed_node2/debug 11661 1726882388.37342: done queuing things up, now waiting for results queue to drain 11661 1726882388.37344: waiting for pending results... 11661 1726882388.37515: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 11661 1726882388.37591: in run() - task 0e448fcc-3ce9-896b-2321-00000000003a 11661 1726882388.37603: variable 'ansible_search_path' from source: unknown 11661 1726882388.37606: variable 'ansible_search_path' from source: unknown 11661 1726882388.37632: calling self._execute() 11661 1726882388.37700: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882388.37704: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882388.37714: variable 'omit' from source: magic vars 11661 1726882388.37973: variable 'ansible_distribution_major_version' from source: facts 11661 1726882388.37983: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882388.38066: variable 'network_state' from source: role '' defaults 11661 1726882388.38073: Evaluated conditional (network_state != {}): False 11661 1726882388.38076: when evaluation is False, skipping this task 11661 1726882388.38079: _execute() done 11661 1726882388.38081: dumping result to json 11661 1726882388.38089: done 
dumping result, returning 11661 1726882388.38092: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-896b-2321-00000000003a] 11661 1726882388.38097: sending task result for task 0e448fcc-3ce9-896b-2321-00000000003a 11661 1726882388.38185: done sending task result for task 0e448fcc-3ce9-896b-2321-00000000003a 11661 1726882388.38188: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "network_state != {}" } 11661 1726882388.38258: no more pending results, returning what we have 11661 1726882388.38262: results queue empty 11661 1726882388.38263: checking for any_errors_fatal 11661 1726882388.38270: done checking for any_errors_fatal 11661 1726882388.38271: checking for max_fail_percentage 11661 1726882388.38273: done checking for max_fail_percentage 11661 1726882388.38273: checking to see if all hosts have failed and the running result is not ok 11661 1726882388.38274: done checking to see if all hosts have failed 11661 1726882388.38274: getting the remaining hosts for this loop 11661 1726882388.38276: done getting the remaining hosts for this loop 11661 1726882388.38278: getting the next task for host managed_node2 11661 1726882388.38283: done getting next task for host managed_node2 11661 1726882388.38286: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 11661 1726882388.38289: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 11661 1726882388.38305: getting variables 11661 1726882388.38306: in VariableManager get_vars() 11661 1726882388.38337: Calling all_inventory to load vars for managed_node2 11661 1726882388.38339: Calling groups_inventory to load vars for managed_node2 11661 1726882388.38340: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882388.38346: Calling all_plugins_play to load vars for managed_node2 11661 1726882388.38348: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882388.38350: Calling groups_plugins_play to load vars for managed_node2 11661 1726882388.39204: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882388.40125: done with get_vars() 11661 1726882388.40140: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:33:08 -0400 (0:00:00.030) 0:00:17.115 ****** 11661 1726882388.40209: entering _queue_task() for managed_node2/ping 11661 1726882388.40210: Creating lock for ping 11661 1726882388.40416: worker is 1 (out of 1 available) 11661 1726882388.40430: exiting _queue_task() for managed_node2/ping 11661 1726882388.40440: done queuing things up, now waiting for results queue to drain 11661 1726882388.40442: waiting for pending results... 
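The `connections` payload echoed in the earlier `Show debug messages for the network_connections` result can be reconstructed as plain data. A minimal sketch, with names and values copied from the `module_args` in the log above (this is an illustration of the payload shape, not how the role builds it internally):

```python
# Reconstruction of the "connections" list from the task result above.
# Values (miimon 110, active-backup, route_metric4 65535, port names
# test1/test2) are copied verbatim from the log.
connections = [
    {"name": "bond0", "type": "bond", "interface_name": "nm-bond",
     "state": "up",
     "bond": {"mode": "active-backup", "miimon": 110},
     "ip": {"route_metric4": 65535}},
    {"name": "bond0.0", "type": "ethernet", "interface_name": "test1",
     "state": "up", "controller": "bond0"},
    {"name": "bond0.1", "type": "ethernet", "interface_name": "test2",
     "state": "up", "controller": "bond0"},
]

# The two ethernet connections are attached to the bond via "controller".
ports = [c["name"] for c in connections if c.get("controller") == "bond0"]
print(ports)  # → ['bond0.0', 'bond0.1']
```

This matches the stderr lines `[008]`/`[009]` in the result, where `bond0.0` and `bond0.1` are brought up as port connections of `bond0`.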
11661 1726882388.40614: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 11661 1726882388.40707: in run() - task 0e448fcc-3ce9-896b-2321-00000000003b 11661 1726882388.40722: variable 'ansible_search_path' from source: unknown 11661 1726882388.40725: variable 'ansible_search_path' from source: unknown 11661 1726882388.40756: calling self._execute() 11661 1726882388.40823: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882388.40826: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882388.40835: variable 'omit' from source: magic vars 11661 1726882388.41103: variable 'ansible_distribution_major_version' from source: facts 11661 1726882388.41113: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882388.41118: variable 'omit' from source: magic vars 11661 1726882388.41159: variable 'omit' from source: magic vars 11661 1726882388.41183: variable 'omit' from source: magic vars 11661 1726882388.41216: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11661 1726882388.41245: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11661 1726882388.41262: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11661 1726882388.41277: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882388.41286: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882388.41309: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11661 1726882388.41312: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882388.41314: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 11661 1726882388.41386: Set connection var ansible_connection to ssh 11661 1726882388.41389: Set connection var ansible_pipelining to False 11661 1726882388.41395: Set connection var ansible_module_compression to ZIP_DEFLATED 11661 1726882388.41401: Set connection var ansible_timeout to 10 11661 1726882388.41404: Set connection var ansible_shell_type to sh 11661 1726882388.41410: Set connection var ansible_shell_executable to /bin/sh 11661 1726882388.41426: variable 'ansible_shell_executable' from source: unknown 11661 1726882388.41429: variable 'ansible_connection' from source: unknown 11661 1726882388.41432: variable 'ansible_module_compression' from source: unknown 11661 1726882388.41436: variable 'ansible_shell_type' from source: unknown 11661 1726882388.41438: variable 'ansible_shell_executable' from source: unknown 11661 1726882388.41441: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882388.41443: variable 'ansible_pipelining' from source: unknown 11661 1726882388.41445: variable 'ansible_timeout' from source: unknown 11661 1726882388.41447: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882388.41591: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11661 1726882388.41600: variable 'omit' from source: magic vars 11661 1726882388.41603: starting attempt loop 11661 1726882388.41606: running the handler 11661 1726882388.41617: _low_level_execute_command(): starting 11661 1726882388.41624: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11661 1726882388.42141: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 
1726882388.42160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882388.42175: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882388.42188: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 11661 1726882388.42199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882388.42244: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882388.42255: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882388.42270: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882388.42382: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882388.44095: stdout chunk (state=3): >>>/root <<< 11661 1726882388.44193: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882388.44245: stderr chunk (state=3): >>><<< 11661 1726882388.44263: stdout chunk (state=3): >>><<< 11661 1726882388.44284: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882388.44296: _low_level_execute_command(): starting 11661 1726882388.44301: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882388.442835-12443-82609226525002 `" && echo ansible-tmp-1726882388.442835-12443-82609226525002="` echo /root/.ansible/tmp/ansible-tmp-1726882388.442835-12443-82609226525002 `" ) && sleep 0' 11661 1726882388.44767: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882388.44772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882388.44816: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882388.44826: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882388.44829: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882388.44866: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882388.44876: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882388.44880: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882388.44994: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882388.46952: stdout chunk (state=3): >>>ansible-tmp-1726882388.442835-12443-82609226525002=/root/.ansible/tmp/ansible-tmp-1726882388.442835-12443-82609226525002 <<< 11661 1726882388.47063: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882388.47121: stderr chunk (state=3): >>><<< 11661 1726882388.47125: stdout chunk (state=3): >>><<< 11661 1726882388.47140: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882388.442835-12443-82609226525002=/root/.ansible/tmp/ansible-tmp-1726882388.442835-12443-82609226525002 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882388.47184: variable 'ansible_module_compression' from source: unknown 11661 1726882388.47219: ANSIBALLZ: Using lock for ping 11661 1726882388.47222: ANSIBALLZ: Acquiring lock 11661 1726882388.47225: ANSIBALLZ: Lock acquired: 139652576510416 11661 1726882388.47227: ANSIBALLZ: Creating module 11661 1726882388.54947: ANSIBALLZ: Writing module into payload 11661 1726882388.54991: ANSIBALLZ: Writing module 11661 1726882388.55013: ANSIBALLZ: Renaming module 11661 1726882388.55017: ANSIBALLZ: Done creating module 11661 1726882388.55030: variable 'ansible_facts' from source: unknown 11661 1726882388.55076: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882388.442835-12443-82609226525002/AnsiballZ_ping.py 11661 1726882388.55189: Sending initial data 11661 1726882388.55193: Sent initial data (151 bytes) 11661 1726882388.55901: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882388.55909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882388.55937: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882388.55949: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882388.56001: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882388.56013: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882388.56023: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882388.56130: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882388.57971: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11661 1726882388.58070: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 11661 1726882388.58177: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11661k_nrvl51/tmp6v4o1f63 
/root/.ansible/tmp/ansible-tmp-1726882388.442835-12443-82609226525002/AnsiballZ_ping.py <<< 11661 1726882388.58266: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11661 1726882388.59267: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882388.59366: stderr chunk (state=3): >>><<< 11661 1726882388.59371: stdout chunk (state=3): >>><<< 11661 1726882388.59386: done transferring module to remote 11661 1726882388.59396: _low_level_execute_command(): starting 11661 1726882388.59400: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882388.442835-12443-82609226525002/ /root/.ansible/tmp/ansible-tmp-1726882388.442835-12443-82609226525002/AnsiballZ_ping.py && sleep 0' 11661 1726882388.59847: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882388.59858: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882388.59887: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882388.59899: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882388.59947: stderr chunk (state=3): >>>debug1: auto-mux: 
Trying existing master <<< 11661 1726882388.59968: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882388.60076: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882388.61867: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882388.61913: stderr chunk (state=3): >>><<< 11661 1726882388.61916: stdout chunk (state=3): >>><<< 11661 1726882388.61932: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882388.61935: _low_level_execute_command(): starting 11661 1726882388.61944: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882388.442835-12443-82609226525002/AnsiballZ_ping.py && sleep 0' 11661 1726882388.62399: stderr chunk 
(state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882388.62402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882388.62442: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882388.62445: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882388.62447: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882388.62497: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882388.62501: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882388.62614: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882388.75591: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 11661 1726882388.76663: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 11661 1726882388.76718: stderr chunk (state=3): >>><<< 11661 1726882388.76722: stdout chunk (state=3): >>><<< 11661 1726882388.76740: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
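The stdout chunk above carries the ping module's result as a single JSON object, which the executor then turns into the task result. A minimal sketch of that parsing step, using the stdout copied verbatim from the log (the real result parser in ansible-core is more tolerant of surrounding noise):

```python
import json

# The ping module's stdout, copied verbatim from the log above.
stdout = '{"ping": "pong", "invocation": {"module_args": {"data": "pong"}}}'

# Ansible interprets the JSON object on stdout as the module result;
# this is just the happy-path decode of that payload.
result = json.loads(stdout)
print(result["ping"])  # → pong
```

The `invocation.module_args` echo is how the full argument set (`data: pong` by default) ends up visible in `-vvv` output.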
11661 1726882388.76760: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882388.442835-12443-82609226525002/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11661 1726882388.76769: _low_level_execute_command(): starting 11661 1726882388.76773: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882388.442835-12443-82609226525002/ > /dev/null 2>&1 && sleep 0' 11661 1726882388.77474: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882388.77478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882388.77512: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882388.77524: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 
10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882388.77581: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882388.77592: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882388.77700: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882388.79534: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882388.79602: stderr chunk (state=3): >>><<< 11661 1726882388.79605: stdout chunk (state=3): >>><<< 11661 1726882388.79622: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882388.79636: handler run complete 11661 1726882388.79655: attempt loop complete, returning result 
11661 1726882388.79658: _execute() done 11661 1726882388.79661: dumping result to json 11661 1726882388.79665: done dumping result, returning 11661 1726882388.79672: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-896b-2321-00000000003b] 11661 1726882388.79677: sending task result for task 0e448fcc-3ce9-896b-2321-00000000003b 11661 1726882388.79775: done sending task result for task 0e448fcc-3ce9-896b-2321-00000000003b 11661 1726882388.79777: WORKER PROCESS EXITING
ok: [managed_node2] => {
    "changed": false,
    "ping": "pong"
}
11661 1726882388.79867: no more pending results, returning what we have 11661 1726882388.79871: results queue empty 11661 1726882388.79871: checking for any_errors_fatal 11661 1726882388.79876: done checking for any_errors_fatal 11661 1726882388.79876: checking for max_fail_percentage 11661 1726882388.79878: done checking for max_fail_percentage 11661 1726882388.79879: checking to see if all hosts have failed and the running result is not ok 11661 1726882388.79880: done checking to see if all hosts have failed 11661 1726882388.79881: getting the remaining hosts for this loop 11661 1726882388.79883: done getting the remaining hosts for this loop 11661 1726882388.79886: getting the next task for host managed_node2 11661 1726882388.79895: done getting next task for host managed_node2 11661 1726882388.79898: ^ task is: TASK: meta (role_complete) 11661 1726882388.79901: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state?
(None), always child state? (None), did rescue? False, did start at task? False 11661 1726882388.79911: getting variables 11661 1726882388.79913: in VariableManager get_vars() 11661 1726882388.79956: Calling all_inventory to load vars for managed_node2 11661 1726882388.79959: Calling groups_inventory to load vars for managed_node2 11661 1726882388.79961: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882388.79974: Calling all_plugins_play to load vars for managed_node2 11661 1726882388.79977: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882388.79981: Calling groups_plugins_play to load vars for managed_node2 11661 1726882388.81729: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882388.82788: done with get_vars() 11661 1726882388.82807: done getting variables 11661 1726882388.82868: done queuing things up, now waiting for results queue to drain 11661 1726882388.82870: results queue empty 11661 1726882388.82870: checking for any_errors_fatal 11661 1726882388.82873: done checking for any_errors_fatal 11661 1726882388.82873: checking for max_fail_percentage 11661 1726882388.82874: done checking for max_fail_percentage 11661 1726882388.82874: checking to see if all hosts have failed and the running result is not ok 11661 1726882388.82875: done checking to see if all hosts have failed 11661 1726882388.82875: getting the remaining hosts for this loop 11661 1726882388.82876: done getting the remaining hosts for this loop 11661 1726882388.82878: getting the next task for host managed_node2 11661 1726882388.82881: done getting next task for host managed_node2 11661 1726882388.82883: ^ task is: TASK: Include the task 'get_interface_stat.yml' 11661 1726882388.82884: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11661 1726882388.82886: getting variables 11661 1726882388.82886: in VariableManager get_vars() 11661 1726882388.82898: Calling all_inventory to load vars for managed_node2 11661 1726882388.82899: Calling groups_inventory to load vars for managed_node2 11661 1726882388.82901: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882388.82905: Calling all_plugins_play to load vars for managed_node2 11661 1726882388.82906: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882388.82908: Calling groups_plugins_play to load vars for managed_node2 11661 1726882388.83644: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882388.85245: done with get_vars() 11661 1726882388.85272: done getting variables
TASK [Include the task 'get_interface_stat.yml'] *******************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3
Friday 20 September 2024 21:33:08 -0400 (0:00:00.451) 0:00:17.566 ******
11661 1726882388.85349: entering _queue_task() for managed_node2/include_tasks 11661 1726882388.85721: worker is 1 (out of 1 available) 11661 1726882388.85736: exiting _queue_task() for managed_node2/include_tasks 11661 1726882388.85747: done queuing things up, now waiting for results queue to drain 11661 1726882388.85749: waiting for pending results...
11661 1726882388.86031: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 11661 1726882388.86160: in run() - task 0e448fcc-3ce9-896b-2321-00000000006e 11661 1726882388.86182: variable 'ansible_search_path' from source: unknown 11661 1726882388.86191: variable 'ansible_search_path' from source: unknown 11661 1726882388.86231: calling self._execute() 11661 1726882388.86330: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882388.86342: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882388.86359: variable 'omit' from source: magic vars 11661 1726882388.86757: variable 'ansible_distribution_major_version' from source: facts 11661 1726882388.86777: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882388.86788: _execute() done 11661 1726882388.86795: dumping result to json 11661 1726882388.86803: done dumping result, returning 11661 1726882388.86812: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [0e448fcc-3ce9-896b-2321-00000000006e] 11661 1726882388.86822: sending task result for task 0e448fcc-3ce9-896b-2321-00000000006e 11661 1726882388.86934: done sending task result for task 0e448fcc-3ce9-896b-2321-00000000006e 11661 1726882388.86941: WORKER PROCESS EXITING 11661 1726882388.86981: no more pending results, returning what we have 11661 1726882388.86987: in VariableManager get_vars() 11661 1726882388.87034: Calling all_inventory to load vars for managed_node2 11661 1726882388.87037: Calling groups_inventory to load vars for managed_node2 11661 1726882388.87039: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882388.87055: Calling all_plugins_play to load vars for managed_node2 11661 1726882388.87059: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882388.87064: Calling groups_plugins_play to load vars for managed_node2 11661 
1726882388.88755: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882388.90602: done with get_vars() 11661 1726882388.90622: variable 'ansible_search_path' from source: unknown 11661 1726882388.90623: variable 'ansible_search_path' from source: unknown 11661 1726882388.90669: we have included files to process 11661 1726882388.90670: generating all_blocks data 11661 1726882388.90672: done generating all_blocks data 11661 1726882388.90678: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11661 1726882388.90679: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11661 1726882388.90681: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11661 1726882388.90878: done processing included file 11661 1726882388.90880: iterating over new_blocks loaded from include file 11661 1726882388.90882: in VariableManager get_vars() 11661 1726882388.90902: done with get_vars() 11661 1726882388.90904: filtering new block on tags 11661 1726882388.90922: done filtering new block on tags 11661 1726882388.90924: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 11661 1726882388.90929: extending task lists for all hosts with included blocks 11661 1726882388.91028: done extending task lists 11661 1726882388.91030: done processing included files 11661 1726882388.91030: results queue empty 11661 1726882388.91031: checking for any_errors_fatal 11661 1726882388.91033: done checking for any_errors_fatal 11661 1726882388.91034: checking for max_fail_percentage 11661 1726882388.91035: done checking for 
max_fail_percentage 11661 1726882388.91035: checking to see if all hosts have failed and the running result is not ok 11661 1726882388.91036: done checking to see if all hosts have failed 11661 1726882388.91037: getting the remaining hosts for this loop 11661 1726882388.91038: done getting the remaining hosts for this loop 11661 1726882388.91040: getting the next task for host managed_node2 11661 1726882388.91044: done getting next task for host managed_node2 11661 1726882388.91046: ^ task is: TASK: Get stat for interface {{ interface }} 11661 1726882388.91049: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11661 1726882388.91055: getting variables 11661 1726882388.91056: in VariableManager get_vars() 11661 1726882388.91072: Calling all_inventory to load vars for managed_node2 11661 1726882388.91074: Calling groups_inventory to load vars for managed_node2 11661 1726882388.91076: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882388.91082: Calling all_plugins_play to load vars for managed_node2 11661 1726882388.91084: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882388.91086: Calling groups_plugins_play to load vars for managed_node2 11661 1726882388.92335: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882388.94025: done with get_vars() 11661 1726882388.94052: done getting variables 11661 1726882388.94218: variable 'interface' from source: task vars 11661 1726882388.94221: variable 'controller_device' from source: play vars 11661 1726882388.94284: variable 'controller_device' from source: play vars
TASK [Get stat for interface nm-bond] ******************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3
Friday 20 September 2024 21:33:08 -0400 (0:00:00.089) 0:00:17.656 ******
11661 1726882388.94316: entering _queue_task() for managed_node2/stat 11661 1726882388.94640: worker is 1 (out of 1 available) 11661 1726882388.94654: exiting _queue_task() for managed_node2/stat 11661 1726882388.94669: done queuing things up, now waiting for results queue to drain 11661 1726882388.94671: waiting for pending results...
11661 1726882388.94983: running TaskExecutor() for managed_node2/TASK: Get stat for interface nm-bond 11661 1726882388.95124: in run() - task 0e448fcc-3ce9-896b-2321-000000000241 11661 1726882388.95146: variable 'ansible_search_path' from source: unknown 11661 1726882388.95157: variable 'ansible_search_path' from source: unknown 11661 1726882388.95198: calling self._execute() 11661 1726882388.95298: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882388.95309: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882388.95325: variable 'omit' from source: magic vars 11661 1726882388.95707: variable 'ansible_distribution_major_version' from source: facts 11661 1726882388.95724: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882388.95735: variable 'omit' from source: magic vars 11661 1726882388.95806: variable 'omit' from source: magic vars 11661 1726882388.95912: variable 'interface' from source: task vars 11661 1726882388.95922: variable 'controller_device' from source: play vars 11661 1726882388.95997: variable 'controller_device' from source: play vars 11661 1726882388.96020: variable 'omit' from source: magic vars 11661 1726882388.96073: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11661 1726882388.96118: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11661 1726882388.96143: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11661 1726882388.96170: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882388.96188: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882388.96228: variable 'inventory_hostname' from source: host vars for 
'managed_node2' 11661 1726882388.96237: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882388.96245: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882388.96359: Set connection var ansible_connection to ssh 11661 1726882388.96374: Set connection var ansible_pipelining to False 11661 1726882388.96383: Set connection var ansible_module_compression to ZIP_DEFLATED 11661 1726882388.96391: Set connection var ansible_timeout to 10 11661 1726882388.96396: Set connection var ansible_shell_type to sh 11661 1726882388.96404: Set connection var ansible_shell_executable to /bin/sh 11661 1726882388.96432: variable 'ansible_shell_executable' from source: unknown 11661 1726882388.96440: variable 'ansible_connection' from source: unknown 11661 1726882388.96447: variable 'ansible_module_compression' from source: unknown 11661 1726882388.96458: variable 'ansible_shell_type' from source: unknown 11661 1726882388.96468: variable 'ansible_shell_executable' from source: unknown 11661 1726882388.96475: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882388.96482: variable 'ansible_pipelining' from source: unknown 11661 1726882388.96490: variable 'ansible_timeout' from source: unknown 11661 1726882388.96497: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882388.96704: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11661 1726882388.96717: variable 'omit' from source: magic vars 11661 1726882388.96725: starting attempt loop 11661 1726882388.96730: running the handler 11661 1726882388.96747: _low_level_execute_command(): starting 11661 1726882388.96768: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11661 
1726882388.98173: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882388.98291: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882388.98308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882388.98328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882388.98395: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882388.98514: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882388.98529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882388.98549: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882388.98570: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882388.98584: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882388.98598: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882388.98617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882388.98634: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882388.98648: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882388.98668: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882388.98683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882388.98769: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882388.98848: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882388.98870: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882388.99149: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882389.00737: stdout chunk (state=3): >>>/root <<< 11661 1726882389.00933: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882389.00937: stdout chunk (state=3): >>><<< 11661 1726882389.00939: stderr chunk (state=3): >>><<< 11661 1726882389.01068: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882389.01072: _low_level_execute_command(): starting 11661 1726882389.01075: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882389.0096548-12453-269609256162589 `" && echo 
ansible-tmp-1726882389.0096548-12453-269609256162589="` echo /root/.ansible/tmp/ansible-tmp-1726882389.0096548-12453-269609256162589 `" ) && sleep 0' 11661 1726882389.01679: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882389.01692: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882389.01707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882389.01727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882389.01775: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882389.01787: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882389.01799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882389.01815: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882389.01830: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882389.01840: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882389.01854: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882389.01869: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882389.01884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882389.01895: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882389.01905: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882389.01916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882389.02001: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master <<< 11661 1726882389.02016: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882389.02030: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882389.02161: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882389.04075: stdout chunk (state=3): >>>ansible-tmp-1726882389.0096548-12453-269609256162589=/root/.ansible/tmp/ansible-tmp-1726882389.0096548-12453-269609256162589 <<< 11661 1726882389.04259: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882389.04264: stdout chunk (state=3): >>><<< 11661 1726882389.04267: stderr chunk (state=3): >>><<< 11661 1726882389.04473: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882389.0096548-12453-269609256162589=/root/.ansible/tmp/ansible-tmp-1726882389.0096548-12453-269609256162589 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882389.04477: variable 'ansible_module_compression' from source: unknown 11661 1726882389.04480: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11661k_nrvl51/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11661 1726882389.04482: variable 'ansible_facts' from source: unknown 11661 1726882389.04537: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882389.0096548-12453-269609256162589/AnsiballZ_stat.py 11661 1726882389.04699: Sending initial data 11661 1726882389.04702: Sent initial data (153 bytes) 11661 1726882389.05702: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882389.05715: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882389.05728: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882389.05744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882389.05788: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882389.05804: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882389.05816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882389.05832: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882389.05843: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882389.05853: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882389.05867: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882389.05880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882389.05894: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882389.05910: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882389.05920: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882389.05932: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882389.06009: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882389.06036: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882389.06050: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882389.06184: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882389.07992: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11661 1726882389.08090: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 11661 1726882389.08192: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11661k_nrvl51/tmpza4kt2oe /root/.ansible/tmp/ansible-tmp-1726882389.0096548-12453-269609256162589/AnsiballZ_stat.py <<< 11661 1726882389.08292: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11661 1726882389.09736: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882389.09867: stderr chunk (state=3): >>><<< 11661 1726882389.09870: stdout chunk (state=3): >>><<< 11661 1726882389.09873: done transferring module to remote 11661 1726882389.09879: _low_level_execute_command(): starting 11661 1726882389.09881: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882389.0096548-12453-269609256162589/ /root/.ansible/tmp/ansible-tmp-1726882389.0096548-12453-269609256162589/AnsiballZ_stat.py && sleep 0' 11661 1726882389.10477: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882389.10490: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882389.10504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882389.10527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882389.10574: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882389.10586: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882389.10599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882389.10615: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882389.10632: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882389.10642: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882389.10654: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882389.10672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882389.10687: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882389.10699: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882389.10708: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882389.10723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882389.10806: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882389.10822: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882389.10841: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882389.10977: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882389.12780: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882389.12845: stderr chunk (state=3): >>><<< 11661 1726882389.12849: stdout chunk (state=3): >>><<< 11661 1726882389.12945: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882389.12949: _low_level_execute_command(): starting 11661 1726882389.12951: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882389.0096548-12453-269609256162589/AnsiballZ_stat.py && sleep 0' 11661 1726882389.13559: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882389.13574: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882389.13586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882389.13604: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882389.13649: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882389.13661: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882389.13677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882389.13695: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882389.13705: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882389.13721: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882389.13735: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882389.13746: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882389.13760: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882389.13772: stderr chunk (state=3): >>>debug2: checking match 
for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882389.13781: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882389.13792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882389.13878: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882389.13898: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882389.13911: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882389.14048: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882389.27296: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/nm-bond", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 26266, "dev": 21, "nlink": 1, "atime": 1726882388.0013497, "mtime": 1726882388.0013497, "ctime": 1726882388.0013497, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/nm-bond", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11661 1726882389.28284: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 11661 1726882389.28404: stderr chunk (state=3): >>><<< 11661 1726882389.28407: stdout chunk (state=3): >>><<< 11661 1726882389.28565: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/nm-bond", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 26266, "dev": 21, "nlink": 1, "atime": 1726882388.0013497, "mtime": 1726882388.0013497, "ctime": 1726882388.0013497, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/nm-bond", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 11661 1726882389.28577: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882389.0096548-12453-269609256162589/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11661 1726882389.28579: _low_level_execute_command(): starting 11661 1726882389.28582: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882389.0096548-12453-269609256162589/ > /dev/null 2>&1 && sleep 0' 11661 1726882389.29482: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 11661 1726882389.30086: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882389.30092: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882389.30198: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882389.32006: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882389.32080: stderr chunk (state=3): >>><<< 11661 1726882389.32084: stdout chunk (state=3): >>><<< 11661 1726882389.32394: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882389.32397: handler run complete 11661 1726882389.32400: attempt loop complete, returning result 11661 1726882389.32402: _execute() done 11661 1726882389.32404: dumping result to json 11661 1726882389.32406: done dumping result, returning 11661 1726882389.32409: done running TaskExecutor() for managed_node2/TASK: Get stat for interface nm-bond [0e448fcc-3ce9-896b-2321-000000000241] 11661 1726882389.32411: sending task result for task 0e448fcc-3ce9-896b-2321-000000000241 11661 1726882389.32495: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000241 11661 1726882389.32499: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "atime": 1726882388.0013497, "block_size": 4096, "blocks": 0, "ctime": 1726882388.0013497, "dev": 21, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 26266, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "mode": "0777", "mtime": 1726882388.0013497, "nlink": 1, "path": "/sys/class/net/nm-bond", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 11661 1726882389.32602: no more pending results, returning what we have 11661 1726882389.32606: results queue empty 11661 1726882389.32607: checking for any_errors_fatal 11661 1726882389.32608: done checking for any_errors_fatal 11661 1726882389.32609: checking for max_fail_percentage 11661 1726882389.32612: done checking for max_fail_percentage 11661 1726882389.32613: checking to see if all hosts have 
failed and the running result is not ok 11661 1726882389.32614: done checking to see if all hosts have failed 11661 1726882389.32614: getting the remaining hosts for this loop 11661 1726882389.32616: done getting the remaining hosts for this loop 11661 1726882389.32620: getting the next task for host managed_node2 11661 1726882389.32629: done getting next task for host managed_node2 11661 1726882389.32632: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 11661 1726882389.32635: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11661 1726882389.32641: getting variables 11661 1726882389.32643: in VariableManager get_vars() 11661 1726882389.32690: Calling all_inventory to load vars for managed_node2 11661 1726882389.32694: Calling groups_inventory to load vars for managed_node2 11661 1726882389.32696: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882389.32708: Calling all_plugins_play to load vars for managed_node2 11661 1726882389.32712: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882389.32715: Calling groups_plugins_play to load vars for managed_node2 11661 1726882389.35903: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882389.37645: done with get_vars() 11661 1726882389.37680: done getting variables 11661 1726882389.37748: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11661 1726882389.37877: variable 'interface' from source: task vars 11661 1726882389.37881: variable 'controller_device' from source: play vars 11661 1726882389.37943: variable 'controller_device' from source: play vars TASK [Assert that the interface is present - 'nm-bond'] ************************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:33:09 -0400 (0:00:00.436) 0:00:18.093 ****** 11661 1726882389.37979: entering _queue_task() for managed_node2/assert 11661 1726882389.38316: worker is 1 (out of 1 available) 11661 1726882389.38327: exiting _queue_task() for managed_node2/assert 11661 1726882389.38345: done queuing things up, now waiting for results queue to drain 11661 1726882389.38347: waiting for pending results... 
11661 1726882389.38629: running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'nm-bond' 11661 1726882389.38755: in run() - task 0e448fcc-3ce9-896b-2321-00000000006f 11661 1726882389.38778: variable 'ansible_search_path' from source: unknown 11661 1726882389.38797: variable 'ansible_search_path' from source: unknown 11661 1726882389.38838: calling self._execute() 11661 1726882389.38940: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882389.38952: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882389.38968: variable 'omit' from source: magic vars 11661 1726882389.39344: variable 'ansible_distribution_major_version' from source: facts 11661 1726882389.39362: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882389.39375: variable 'omit' from source: magic vars 11661 1726882389.39424: variable 'omit' from source: magic vars 11661 1726882389.39529: variable 'interface' from source: task vars 11661 1726882389.39538: variable 'controller_device' from source: play vars 11661 1726882389.39611: variable 'controller_device' from source: play vars 11661 1726882389.39634: variable 'omit' from source: magic vars 11661 1726882389.39690: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11661 1726882389.39729: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11661 1726882389.39755: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11661 1726882389.39786: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882389.39803: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882389.39836: variable 'inventory_hostname' from source: 
host vars for 'managed_node2' 11661 1726882389.39845: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882389.39853: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882389.39966: Set connection var ansible_connection to ssh 11661 1726882389.39983: Set connection var ansible_pipelining to False 11661 1726882389.39998: Set connection var ansible_module_compression to ZIP_DEFLATED 11661 1726882389.40012: Set connection var ansible_timeout to 10 11661 1726882389.40019: Set connection var ansible_shell_type to sh 11661 1726882389.40031: Set connection var ansible_shell_executable to /bin/sh 11661 1726882389.40058: variable 'ansible_shell_executable' from source: unknown 11661 1726882389.40069: variable 'ansible_connection' from source: unknown 11661 1726882389.40077: variable 'ansible_module_compression' from source: unknown 11661 1726882389.40085: variable 'ansible_shell_type' from source: unknown 11661 1726882389.40097: variable 'ansible_shell_executable' from source: unknown 11661 1726882389.40107: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882389.40115: variable 'ansible_pipelining' from source: unknown 11661 1726882389.40123: variable 'ansible_timeout' from source: unknown 11661 1726882389.40132: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882389.40279: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11661 1726882389.40296: variable 'omit' from source: magic vars 11661 1726882389.40310: starting attempt loop 11661 1726882389.40320: running the handler 11661 1726882389.40463: variable 'interface_stat' from source: set_fact 11661 1726882389.40491: Evaluated conditional 
(interface_stat.stat.exists): True 11661 1726882389.40501: handler run complete 11661 1726882389.40519: attempt loop complete, returning result 11661 1726882389.40533: _execute() done 11661 1726882389.40541: dumping result to json 11661 1726882389.40549: done dumping result, returning 11661 1726882389.40560: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'nm-bond' [0e448fcc-3ce9-896b-2321-00000000006f] 11661 1726882389.40573: sending task result for task 0e448fcc-3ce9-896b-2321-00000000006f ok: [managed_node2] => { "changed": false } MSG: All assertions passed 11661 1726882389.40722: no more pending results, returning what we have 11661 1726882389.40726: results queue empty 11661 1726882389.40727: checking for any_errors_fatal 11661 1726882389.40737: done checking for any_errors_fatal 11661 1726882389.40738: checking for max_fail_percentage 11661 1726882389.40740: done checking for max_fail_percentage 11661 1726882389.40740: checking to see if all hosts have failed and the running result is not ok 11661 1726882389.40741: done checking to see if all hosts have failed 11661 1726882389.40742: getting the remaining hosts for this loop 11661 1726882389.40744: done getting the remaining hosts for this loop 11661 1726882389.40748: getting the next task for host managed_node2 11661 1726882389.40757: done getting next task for host managed_node2 11661 1726882389.40761: ^ task is: TASK: Include the task 'assert_profile_present.yml' 11661 1726882389.40764: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11661 1726882389.40768: getting variables 11661 1726882389.40770: in VariableManager get_vars() 11661 1726882389.40814: Calling all_inventory to load vars for managed_node2 11661 1726882389.40816: Calling groups_inventory to load vars for managed_node2 11661 1726882389.40819: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882389.40830: Calling all_plugins_play to load vars for managed_node2 11661 1726882389.40833: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882389.40837: Calling groups_plugins_play to load vars for managed_node2 11661 1726882389.41904: done sending task result for task 0e448fcc-3ce9-896b-2321-00000000006f 11661 1726882389.41907: WORKER PROCESS EXITING 11661 1726882389.43029: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882389.45213: done with get_vars() 11661 1726882389.45240: done getting variables TASK [Include the task 'assert_profile_present.yml'] *************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:67 Friday 20 September 2024 21:33:09 -0400 (0:00:00.073) 0:00:18.166 ****** 11661 1726882389.45349: entering _queue_task() for managed_node2/include_tasks 11661 1726882389.45835: worker is 1 (out of 1 available) 11661 1726882389.45848: exiting _queue_task() for managed_node2/include_tasks 11661 1726882389.45862: done queuing things up, now waiting for results queue to drain 11661 1726882389.45866: waiting for pending results... 
11661 1726882389.46827: running TaskExecutor() for managed_node2/TASK: Include the task 'assert_profile_present.yml' 11661 1726882389.47105: in run() - task 0e448fcc-3ce9-896b-2321-000000000070 11661 1726882389.47131: variable 'ansible_search_path' from source: unknown 11661 1726882389.47190: variable 'controller_profile' from source: play vars 11661 1726882389.47401: variable 'controller_profile' from source: play vars 11661 1726882389.47426: variable 'port1_profile' from source: play vars 11661 1726882389.47504: variable 'port1_profile' from source: play vars 11661 1726882389.47518: variable 'port2_profile' from source: play vars 11661 1726882389.47593: variable 'port2_profile' from source: play vars 11661 1726882389.47611: variable 'omit' from source: magic vars 11661 1726882389.47765: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882389.47785: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882389.47802: variable 'omit' from source: magic vars 11661 1726882389.48056: variable 'ansible_distribution_major_version' from source: facts 11661 1726882389.48079: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882389.48117: variable 'item' from source: unknown 11661 1726882389.48185: variable 'item' from source: unknown 11661 1726882389.48383: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882389.48397: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882389.48412: variable 'omit' from source: magic vars 11661 1726882389.48577: variable 'ansible_distribution_major_version' from source: facts 11661 1726882389.48587: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882389.48621: variable 'item' from source: unknown 11661 1726882389.48690: variable 'item' from source: unknown 11661 1726882389.48830: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 
1726882389.48847: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882389.48862: variable 'omit' from source: magic vars 11661 1726882389.49062: variable 'ansible_distribution_major_version' from source: facts 11661 1726882389.49596: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882389.49624: variable 'item' from source: unknown 11661 1726882389.49696: variable 'item' from source: unknown 11661 1726882389.49771: dumping result to json 11661 1726882389.49782: done dumping result, returning 11661 1726882389.49793: done running TaskExecutor() for managed_node2/TASK: Include the task 'assert_profile_present.yml' [0e448fcc-3ce9-896b-2321-000000000070] 11661 1726882389.49806: sending task result for task 0e448fcc-3ce9-896b-2321-000000000070 11661 1726882389.49875: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000070 11661 1726882389.49882: WORKER PROCESS EXITING 11661 1726882389.49936: no more pending results, returning what we have 11661 1726882389.49941: in VariableManager get_vars() 11661 1726882389.49989: Calling all_inventory to load vars for managed_node2 11661 1726882389.49992: Calling groups_inventory to load vars for managed_node2 11661 1726882389.49994: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882389.50008: Calling all_plugins_play to load vars for managed_node2 11661 1726882389.50013: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882389.50017: Calling groups_plugins_play to load vars for managed_node2 11661 1726882389.57985: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882389.60541: done with get_vars() 11661 1726882389.60570: variable 'ansible_search_path' from source: unknown 11661 1726882389.60586: variable 'ansible_search_path' from source: unknown 11661 1726882389.60594: variable 'ansible_search_path' from source: unknown 11661 
1726882389.60599: we have included files to process 11661 1726882389.60600: generating all_blocks data 11661 1726882389.60601: done generating all_blocks data 11661 1726882389.60603: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11661 1726882389.60604: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11661 1726882389.60606: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11661 1726882389.60785: in VariableManager get_vars() 11661 1726882389.60808: done with get_vars() 11661 1726882389.61066: done processing included file 11661 1726882389.61068: iterating over new_blocks loaded from include file 11661 1726882389.61070: in VariableManager get_vars() 11661 1726882389.61088: done with get_vars() 11661 1726882389.61090: filtering new block on tags 11661 1726882389.61112: done filtering new block on tags 11661 1726882389.61114: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node2 => (item=bond0) 11661 1726882389.61120: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11661 1726882389.61121: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11661 1726882389.61124: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11661 1726882389.61235: in VariableManager get_vars() 11661 1726882389.61253: done with get_vars() 11661 1726882389.61504: done 
processing included file 11661 1726882389.61506: iterating over new_blocks loaded from include file 11661 1726882389.61508: in VariableManager get_vars() 11661 1726882389.61524: done with get_vars() 11661 1726882389.61525: filtering new block on tags 11661 1726882389.61544: done filtering new block on tags 11661 1726882389.61546: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node2 => (item=bond0.0) 11661 1726882389.61549: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11661 1726882389.61550: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11661 1726882389.61553: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11661 1726882389.61775: in VariableManager get_vars() 11661 1726882389.61797: done with get_vars() 11661 1726882389.62244: done processing included file 11661 1726882389.62245: iterating over new_blocks loaded from include file 11661 1726882389.62247: in VariableManager get_vars() 11661 1726882389.62378: done with get_vars() 11661 1726882389.62381: filtering new block on tags 11661 1726882389.62399: done filtering new block on tags 11661 1726882389.62401: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node2 => (item=bond0.1) 11661 1726882389.62405: extending task lists for all hosts with included blocks 11661 1726882389.65418: done extending task lists 11661 1726882389.65425: done processing included files 11661 1726882389.65426: results queue empty 11661 
1726882389.65427: checking for any_errors_fatal 11661 1726882389.65430: done checking for any_errors_fatal 11661 1726882389.65431: checking for max_fail_percentage 11661 1726882389.65433: done checking for max_fail_percentage 11661 1726882389.65433: checking to see if all hosts have failed and the running result is not ok 11661 1726882389.65434: done checking to see if all hosts have failed 11661 1726882389.65435: getting the remaining hosts for this loop 11661 1726882389.65436: done getting the remaining hosts for this loop 11661 1726882389.65439: getting the next task for host managed_node2 11661 1726882389.65443: done getting next task for host managed_node2 11661 1726882389.65445: ^ task is: TASK: Include the task 'get_profile_stat.yml' 11661 1726882389.65447: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11661 1726882389.65449: getting variables 11661 1726882389.65450: in VariableManager get_vars() 11661 1726882389.65468: Calling all_inventory to load vars for managed_node2 11661 1726882389.65470: Calling groups_inventory to load vars for managed_node2 11661 1726882389.65473: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882389.65479: Calling all_plugins_play to load vars for managed_node2 11661 1726882389.65482: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882389.65485: Calling groups_plugins_play to load vars for managed_node2 11661 1726882389.66784: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882389.68451: done with get_vars() 11661 1726882389.68479: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 21:33:09 -0400 (0:00:00.232) 0:00:18.398 ****** 11661 1726882389.68553: entering _queue_task() for managed_node2/include_tasks 11661 1726882389.68877: worker is 1 (out of 1 available) 11661 1726882389.68889: exiting _queue_task() for managed_node2/include_tasks 11661 1726882389.68902: done queuing things up, now waiting for results queue to drain 11661 1726882389.68903: waiting for pending results... 
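The repeated `Calling all_inventory / groups_inventory / ... / groups_plugins_play to load vars` entries above are `VariableManager.get_vars()` pulling variable sources in precedence order, with later sources overriding earlier ones. A toy sketch of that merge behavior (not Ansible's actual implementation; the variable names and values are illustrative):

```python
# Toy illustration of precedence-ordered variable merging, as performed
# conceptually by VariableManager.get_vars(): sources are applied from
# lowest to highest precedence, so a later source's value wins.
def merge_vars(*sources):
    merged = {}
    for source in sources:  # lowest-precedence source first
        merged.update(source)  # later sources override earlier keys
    return merged

# Illustrative data only -- not taken from the log above.
all_inventory = {"ansible_timeout": 10}
host_vars = {"ansible_host": "10.31.11.158", "ansible_timeout": 30}

result = merge_vars(all_inventory, host_vars)
```

Here `host_vars` is merged last, so its `ansible_timeout` of 30 overrides the inventory-wide 10, mirroring why host vars beat group and inventory vars in the real precedence chain.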
11661 1726882389.69190: running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' 11661 1726882389.69295: in run() - task 0e448fcc-3ce9-896b-2321-00000000025f 11661 1726882389.69314: variable 'ansible_search_path' from source: unknown 11661 1726882389.69321: variable 'ansible_search_path' from source: unknown 11661 1726882389.69366: calling self._execute() 11661 1726882389.69470: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882389.69485: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882389.69503: variable 'omit' from source: magic vars 11661 1726882389.69877: variable 'ansible_distribution_major_version' from source: facts 11661 1726882389.69897: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882389.69909: _execute() done 11661 1726882389.69917: dumping result to json 11661 1726882389.69926: done dumping result, returning 11661 1726882389.69936: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' [0e448fcc-3ce9-896b-2321-00000000025f] 11661 1726882389.69947: sending task result for task 0e448fcc-3ce9-896b-2321-00000000025f 11661 1726882389.70078: no more pending results, returning what we have 11661 1726882389.70083: in VariableManager get_vars() 11661 1726882389.70131: Calling all_inventory to load vars for managed_node2 11661 1726882389.70134: Calling groups_inventory to load vars for managed_node2 11661 1726882389.70137: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882389.70151: Calling all_plugins_play to load vars for managed_node2 11661 1726882389.70156: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882389.70159: Calling groups_plugins_play to load vars for managed_node2 11661 1726882389.71185: done sending task result for task 0e448fcc-3ce9-896b-2321-00000000025f 11661 1726882389.71188: WORKER PROCESS EXITING 11661 
1726882389.71847: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882389.73652: done with get_vars() 11661 1726882389.73674: variable 'ansible_search_path' from source: unknown 11661 1726882389.73676: variable 'ansible_search_path' from source: unknown 11661 1726882389.73715: we have included files to process 11661 1726882389.73716: generating all_blocks data 11661 1726882389.73717: done generating all_blocks data 11661 1726882389.73719: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11661 1726882389.73720: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11661 1726882389.73723: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11661 1726882389.74729: done processing included file 11661 1726882389.74730: iterating over new_blocks loaded from include file 11661 1726882389.74732: in VariableManager get_vars() 11661 1726882389.74752: done with get_vars() 11661 1726882389.74754: filtering new block on tags 11661 1726882389.74781: done filtering new block on tags 11661 1726882389.74784: in VariableManager get_vars() 11661 1726882389.74803: done with get_vars() 11661 1726882389.74805: filtering new block on tags 11661 1726882389.74825: done filtering new block on tags 11661 1726882389.74827: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node2 11661 1726882389.74832: extending task lists for all hosts with included blocks 11661 1726882389.75001: done extending task lists 11661 1726882389.75002: done processing included files 11661 1726882389.75003: results queue empty 11661 
1726882389.75004: checking for any_errors_fatal 11661 1726882389.75008: done checking for any_errors_fatal 11661 1726882389.75009: checking for max_fail_percentage 11661 1726882389.75010: done checking for max_fail_percentage 11661 1726882389.75011: checking to see if all hosts have failed and the running result is not ok 11661 1726882389.75011: done checking to see if all hosts have failed 11661 1726882389.75012: getting the remaining hosts for this loop 11661 1726882389.75013: done getting the remaining hosts for this loop 11661 1726882389.75016: getting the next task for host managed_node2 11661 1726882389.75020: done getting next task for host managed_node2 11661 1726882389.75022: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 11661 1726882389.75025: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11661 1726882389.75027: getting variables 11661 1726882389.75028: in VariableManager get_vars() 11661 1726882389.75091: Calling all_inventory to load vars for managed_node2 11661 1726882389.75094: Calling groups_inventory to load vars for managed_node2 11661 1726882389.75096: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882389.75101: Calling all_plugins_play to load vars for managed_node2 11661 1726882389.75104: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882389.75106: Calling groups_plugins_play to load vars for managed_node2 11661 1726882389.76237: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882389.78550: done with get_vars() 11661 1726882389.78775: done getting variables 11661 1726882389.78825: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:33:09 -0400 (0:00:00.103) 0:00:18.502 ****** 11661 1726882389.78859: entering _queue_task() for managed_node2/set_fact 11661 1726882389.79629: worker is 1 (out of 1 available) 11661 1726882389.79642: exiting _queue_task() for managed_node2/set_fact 11661 1726882389.79655: done queuing things up, now waiting for results queue to drain 11661 1726882389.79658: waiting for pending results... 
11661 1726882389.79991: running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag 11661 1726882389.80110: in run() - task 0e448fcc-3ce9-896b-2321-0000000003b0 11661 1726882389.80131: variable 'ansible_search_path' from source: unknown 11661 1726882389.80140: variable 'ansible_search_path' from source: unknown 11661 1726882389.80184: calling self._execute() 11661 1726882389.80292: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882389.80304: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882389.80320: variable 'omit' from source: magic vars 11661 1726882389.80710: variable 'ansible_distribution_major_version' from source: facts 11661 1726882389.80729: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882389.80742: variable 'omit' from source: magic vars 11661 1726882389.80800: variable 'omit' from source: magic vars 11661 1726882389.80843: variable 'omit' from source: magic vars 11661 1726882389.80895: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11661 1726882389.80938: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11661 1726882389.80966: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11661 1726882389.80993: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882389.81009: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882389.81043: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11661 1726882389.81053: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882389.81062: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 11661 1726882389.81172: Set connection var ansible_connection to ssh 11661 1726882389.81185: Set connection var ansible_pipelining to False 11661 1726882389.81200: Set connection var ansible_module_compression to ZIP_DEFLATED 11661 1726882389.81213: Set connection var ansible_timeout to 10 11661 1726882389.81220: Set connection var ansible_shell_type to sh 11661 1726882389.81232: Set connection var ansible_shell_executable to /bin/sh 11661 1726882389.81259: variable 'ansible_shell_executable' from source: unknown 11661 1726882389.81270: variable 'ansible_connection' from source: unknown 11661 1726882389.81278: variable 'ansible_module_compression' from source: unknown 11661 1726882389.81286: variable 'ansible_shell_type' from source: unknown 11661 1726882389.81294: variable 'ansible_shell_executable' from source: unknown 11661 1726882389.81304: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882389.81313: variable 'ansible_pipelining' from source: unknown 11661 1726882389.81319: variable 'ansible_timeout' from source: unknown 11661 1726882389.81327: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882389.81468: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11661 1726882389.81487: variable 'omit' from source: magic vars 11661 1726882389.81497: starting attempt loop 11661 1726882389.81508: running the handler 11661 1726882389.82225: handler run complete 11661 1726882389.82241: attempt loop complete, returning result 11661 1726882389.82249: _execute() done 11661 1726882389.82255: dumping result to json 11661 1726882389.82265: done dumping result, returning 11661 1726882389.82276: done running TaskExecutor() for 
managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag [0e448fcc-3ce9-896b-2321-0000000003b0] 11661 1726882389.82285: sending task result for task 0e448fcc-3ce9-896b-2321-0000000003b0 11661 1726882389.82392: done sending task result for task 0e448fcc-3ce9-896b-2321-0000000003b0 11661 1726882389.82400: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 11661 1726882389.82467: no more pending results, returning what we have 11661 1726882389.82470: results queue empty 11661 1726882389.82471: checking for any_errors_fatal 11661 1726882389.82473: done checking for any_errors_fatal 11661 1726882389.82474: checking for max_fail_percentage 11661 1726882389.82475: done checking for max_fail_percentage 11661 1726882389.82476: checking to see if all hosts have failed and the running result is not ok 11661 1726882389.82477: done checking to see if all hosts have failed 11661 1726882389.82478: getting the remaining hosts for this loop 11661 1726882389.82480: done getting the remaining hosts for this loop 11661 1726882389.82483: getting the next task for host managed_node2 11661 1726882389.82492: done getting next task for host managed_node2 11661 1726882389.82495: ^ task is: TASK: Stat profile file 11661 1726882389.82499: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11661 1726882389.82504: getting variables 11661 1726882389.82506: in VariableManager get_vars() 11661 1726882389.82547: Calling all_inventory to load vars for managed_node2 11661 1726882389.82550: Calling groups_inventory to load vars for managed_node2 11661 1726882389.82553: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882389.82566: Calling all_plugins_play to load vars for managed_node2 11661 1726882389.82570: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882389.82573: Calling groups_plugins_play to load vars for managed_node2 11661 1726882389.84580: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882389.88075: done with get_vars() 11661 1726882389.88108: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:33:09 -0400 (0:00:00.093) 0:00:18.595 ****** 11661 1726882389.88201: entering _queue_task() for managed_node2/stat 11661 1726882389.88979: worker is 1 (out of 1 available) 11661 1726882389.88990: exiting _queue_task() for managed_node2/stat 11661 1726882389.89003: done queuing things up, now waiting for results queue to drain 11661 1726882389.89004: waiting for pending results... 
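The task just queued ("Stat profile file", via `_queue_task() for managed_node2/stat`) ultimately runs the `stat` module on the managed host to decide whether the NM profile file exists. A minimal local sketch of what that check boils down to (the profile path below is an assumption for illustration, and this is not the real `stat` module, whose return payload is far richer):

```python
# Hedged sketch of the core of the queued "Stat profile file" task:
# check whether a file exists and report it in an Ansible-style result.
import os

def profile_stat(path):
    # The real stat module also returns mode, uid, checksums, etc.;
    # here we only model the 'exists' flag the test playbook cares about.
    return {"stat": {"exists": os.path.exists(path)}, "changed": False}

# Illustrative path -- the actual profile location depends on the
# NetworkManager configuration on the managed node.
result = profile_stat("/etc/NetworkManager/system-connections/bond0.nmconnection")
```

The `changed: False` field matches the log's earlier `set_fact` result: pure fact-gathering tasks never report a change.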
11661 1726882389.89277: running TaskExecutor() for managed_node2/TASK: Stat profile file 11661 1726882389.89401: in run() - task 0e448fcc-3ce9-896b-2321-0000000003b1 11661 1726882389.89419: variable 'ansible_search_path' from source: unknown 11661 1726882389.89427: variable 'ansible_search_path' from source: unknown 11661 1726882389.89472: calling self._execute() 11661 1726882389.89572: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882389.89584: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882389.89599: variable 'omit' from source: magic vars 11661 1726882389.90179: variable 'ansible_distribution_major_version' from source: facts 11661 1726882389.90220: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882389.90270: variable 'omit' from source: magic vars 11661 1726882389.90386: variable 'omit' from source: magic vars 11661 1726882389.90613: variable 'profile' from source: include params 11661 1726882389.90646: variable 'item' from source: include params 11661 1726882389.90872: variable 'item' from source: include params 11661 1726882389.90896: variable 'omit' from source: magic vars 11661 1726882389.90945: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11661 1726882389.91003: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11661 1726882389.91093: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11661 1726882389.91116: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882389.91158: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882389.91218: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11661 
1726882389.91301: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882389.91310: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882389.91485: Set connection var ansible_connection to ssh 11661 1726882389.91520: Set connection var ansible_pipelining to False 11661 1726882389.91628: Set connection var ansible_module_compression to ZIP_DEFLATED 11661 1726882389.91642: Set connection var ansible_timeout to 10 11661 1726882389.91650: Set connection var ansible_shell_type to sh 11661 1726882389.91667: Set connection var ansible_shell_executable to /bin/sh 11661 1726882389.91695: variable 'ansible_shell_executable' from source: unknown 11661 1726882389.91704: variable 'ansible_connection' from source: unknown 11661 1726882389.91712: variable 'ansible_module_compression' from source: unknown 11661 1726882389.91719: variable 'ansible_shell_type' from source: unknown 11661 1726882389.91731: variable 'ansible_shell_executable' from source: unknown 11661 1726882389.91738: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882389.91747: variable 'ansible_pipelining' from source: unknown 11661 1726882389.91754: variable 'ansible_timeout' from source: unknown 11661 1726882389.91761: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882389.92295: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11661 1726882389.92346: variable 'omit' from source: magic vars 11661 1726882389.92358: starting attempt loop 11661 1726882389.92367: running the handler 11661 1726882389.92385: _low_level_execute_command(): starting 11661 1726882389.92453: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11661 1726882389.94478: stderr chunk 
(state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882389.94483: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882389.94509: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882389.94513: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882389.94516: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882389.94688: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882389.94751: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882389.94754: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882389.94873: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882389.96563: stdout chunk (state=3): >>>/root <<< 11661 1726882389.96658: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882389.96752: stderr chunk (state=3): >>><<< 11661 1726882389.96755: stdout chunk (state=3): >>><<< 11661 1726882389.96875: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882389.96879: _low_level_execute_command(): starting 11661 1726882389.96882: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882389.9677835-12494-29008330649558 `" && echo ansible-tmp-1726882389.9677835-12494-29008330649558="` echo /root/.ansible/tmp/ansible-tmp-1726882389.9677835-12494-29008330649558 `" ) && sleep 0' 11661 1726882389.98589: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882389.98594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882389.98640: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 11661 
1726882389.98651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882389.98654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882389.98711: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882389.98721: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882389.98734: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882389.98897: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882390.00793: stdout chunk (state=3): >>>ansible-tmp-1726882389.9677835-12494-29008330649558=/root/.ansible/tmp/ansible-tmp-1726882389.9677835-12494-29008330649558 <<< 11661 1726882390.00997: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882390.01001: stdout chunk (state=3): >>><<< 11661 1726882390.01008: stderr chunk (state=3): >>><<< 11661 1726882390.01028: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882389.9677835-12494-29008330649558=/root/.ansible/tmp/ansible-tmp-1726882389.9677835-12494-29008330649558 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882390.01085: variable 'ansible_module_compression' from source: unknown 11661 1726882390.01152: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11661k_nrvl51/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11661 1726882390.01197: variable 'ansible_facts' from source: unknown 11661 1726882390.01274: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882389.9677835-12494-29008330649558/AnsiballZ_stat.py 11661 1726882390.01979: Sending initial data 11661 1726882390.01984: Sent initial data (152 bytes) 11661 1726882390.04145: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882390.04278: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882390.04287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882390.04302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882390.04340: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882390.04346: stderr chunk (state=3): 
>>>debug2: match not found <<< 11661 1726882390.04360: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882390.04384: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882390.04393: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882390.04400: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882390.04408: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882390.04417: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882390.04429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882390.04436: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882390.04444: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882390.04453: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882390.04643: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882390.04691: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882390.04710: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882390.04854: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882390.06689: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: 
Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11661 1726882390.06782: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 11661 1726882390.06879: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11661k_nrvl51/tmp5o8lxae2 /root/.ansible/tmp/ansible-tmp-1726882389.9677835-12494-29008330649558/AnsiballZ_stat.py <<< 11661 1726882390.06965: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11661 1726882390.08371: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882390.08479: stderr chunk (state=3): >>><<< 11661 1726882390.08482: stdout chunk (state=3): >>><<< 11661 1726882390.08485: done transferring module to remote 11661 1726882390.08568: _low_level_execute_command(): starting 11661 1726882390.08573: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882389.9677835-12494-29008330649558/ /root/.ansible/tmp/ansible-tmp-1726882389.9677835-12494-29008330649558/AnsiballZ_stat.py && sleep 0' 11661 1726882390.09754: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882390.09758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882390.09822: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882390.09826: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882390.09891: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882390.09895: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882390.10002: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882390.11780: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882390.11895: stderr chunk (state=3): >>><<< 11661 1726882390.11906: stdout chunk (state=3): >>><<< 11661 1726882390.11961: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882390.11967: _low_level_execute_command(): starting 11661 1726882390.11970: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882389.9677835-12494-29008330649558/AnsiballZ_stat.py && sleep 0' 11661 1726882390.13361: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882390.13504: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882390.13570: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882390.13594: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882390.13598: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882390.13706: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882390.26927: stdout chunk (state=3): >>> <<< 11661 1726882390.26932: stdout chunk (state=3): 
>>>{"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11661 1726882390.27960: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 11661 1726882390.28021: stderr chunk (state=3): >>><<< 11661 1726882390.28025: stdout chunk (state=3): >>><<< 11661 1726882390.28040: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 
10.31.11.158 closed. 11661 1726882390.28066: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882389.9677835-12494-29008330649558/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11661 1726882390.28074: _low_level_execute_command(): starting 11661 1726882390.28079: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882389.9677835-12494-29008330649558/ > /dev/null 2>&1 && sleep 0' 11661 1726882390.28560: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882390.28567: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882390.28611: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882390.28614: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 
11661 1726882390.28616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882390.28673: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882390.28676: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882390.28684: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882390.28805: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882390.30611: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882390.30666: stderr chunk (state=3): >>><<< 11661 1726882390.30672: stdout chunk (state=3): >>><<< 11661 1726882390.30689: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882390.30695: handler run complete 11661 1726882390.30710: attempt loop complete, returning result 11661 1726882390.30713: _execute() done 11661 1726882390.30715: dumping result to json 11661 1726882390.30718: done dumping result, returning 11661 1726882390.30726: done running TaskExecutor() for managed_node2/TASK: Stat profile file [0e448fcc-3ce9-896b-2321-0000000003b1] 11661 1726882390.30730: sending task result for task 0e448fcc-3ce9-896b-2321-0000000003b1 11661 1726882390.30827: done sending task result for task 0e448fcc-3ce9-896b-2321-0000000003b1 11661 1726882390.30830: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 11661 1726882390.30889: no more pending results, returning what we have 11661 1726882390.30892: results queue empty 11661 1726882390.30893: checking for any_errors_fatal 11661 1726882390.30898: done checking for any_errors_fatal 11661 1726882390.30899: checking for max_fail_percentage 11661 1726882390.30901: done checking for max_fail_percentage 11661 1726882390.30901: checking to see if all hosts have failed and the running result is not ok 11661 1726882390.30902: done checking to see if all hosts have failed 11661 1726882390.30903: getting the remaining hosts for this loop 11661 1726882390.30904: done getting the remaining hosts for this loop 11661 1726882390.30908: getting the next task for host managed_node2 11661 1726882390.30914: done getting next task for host managed_node2 11661 1726882390.30917: ^ task is: TASK: Set NM profile exist flag based on the profile files 11661 1726882390.30921: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11661 1726882390.30924: getting variables 11661 1726882390.30926: in VariableManager get_vars() 11661 1726882390.30972: Calling all_inventory to load vars for managed_node2 11661 1726882390.30975: Calling groups_inventory to load vars for managed_node2 11661 1726882390.30977: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882390.30988: Calling all_plugins_play to load vars for managed_node2 11661 1726882390.30991: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882390.30994: Calling groups_plugins_play to load vars for managed_node2 11661 1726882390.32483: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882390.33555: done with get_vars() 11661 1726882390.33575: done getting variables 11661 1726882390.33618: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:33:10 -0400 (0:00:00.454) 0:00:19.049 ****** 11661 1726882390.33641: entering _queue_task() for managed_node2/set_fact 11661 1726882390.33885: worker is 1 (out of 1 available) 11661 1726882390.33897: exiting _queue_task() for managed_node2/set_fact 11661 1726882390.33911: done queuing things up, now waiting for results queue to drain 11661 1726882390.33913: waiting for pending results... 11661 1726882390.34093: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files 11661 1726882390.34160: in run() - task 0e448fcc-3ce9-896b-2321-0000000003b2 11661 1726882390.34172: variable 'ansible_search_path' from source: unknown 11661 1726882390.34176: variable 'ansible_search_path' from source: unknown 11661 1726882390.34205: calling self._execute() 11661 1726882390.34282: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882390.34285: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882390.34295: variable 'omit' from source: magic vars 11661 1726882390.34703: variable 'ansible_distribution_major_version' from source: facts 11661 1726882390.34718: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882390.34840: variable 'profile_stat' from source: set_fact 11661 1726882390.34860: Evaluated conditional (profile_stat.stat.exists): False 11661 1726882390.34876: when evaluation is False, skipping this task 11661 1726882390.34884: _execute() done 11661 1726882390.34893: dumping result to json 11661 1726882390.34901: done dumping result, returning 11661 1726882390.34910: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files [0e448fcc-3ce9-896b-2321-0000000003b2] 11661 1726882390.34919: sending task result for task 
0e448fcc-3ce9-896b-2321-0000000003b2 11661 1726882390.35013: done sending task result for task 0e448fcc-3ce9-896b-2321-0000000003b2 11661 1726882390.35016: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11661 1726882390.35283: no more pending results, returning what we have 11661 1726882390.35287: results queue empty 11661 1726882390.35288: checking for any_errors_fatal 11661 1726882390.35294: done checking for any_errors_fatal 11661 1726882390.35295: checking for max_fail_percentage 11661 1726882390.35297: done checking for max_fail_percentage 11661 1726882390.35297: checking to see if all hosts have failed and the running result is not ok 11661 1726882390.35298: done checking to see if all hosts have failed 11661 1726882390.35299: getting the remaining hosts for this loop 11661 1726882390.35300: done getting the remaining hosts for this loop 11661 1726882390.35304: getting the next task for host managed_node2 11661 1726882390.35309: done getting next task for host managed_node2 11661 1726882390.35312: ^ task is: TASK: Get NM profile info 11661 1726882390.35316: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 11661 1726882390.35319: getting variables 11661 1726882390.35320: in VariableManager get_vars() 11661 1726882390.35355: Calling all_inventory to load vars for managed_node2 11661 1726882390.35358: Calling groups_inventory to load vars for managed_node2 11661 1726882390.35361: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882390.35372: Calling all_plugins_play to load vars for managed_node2 11661 1726882390.35375: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882390.35379: Calling groups_plugins_play to load vars for managed_node2 11661 1726882390.36798: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882390.39025: done with get_vars() 11661 1726882390.39063: done getting variables 11661 1726882390.39120: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:33:10 -0400 (0:00:00.055) 0:00:19.104 ****** 11661 1726882390.39154: entering _queue_task() for managed_node2/shell 11661 1726882390.39521: worker is 1 (out of 1 available) 11661 1726882390.39534: exiting _queue_task() for managed_node2/shell 11661 1726882390.39546: done queuing things up, now waiting for results queue to drain 11661 1726882390.39547: waiting for pending results... 
11661 1726882390.39865: running TaskExecutor() for managed_node2/TASK: Get NM profile info 11661 1726882390.39988: in run() - task 0e448fcc-3ce9-896b-2321-0000000003b3 11661 1726882390.40014: variable 'ansible_search_path' from source: unknown 11661 1726882390.40023: variable 'ansible_search_path' from source: unknown 11661 1726882390.40070: calling self._execute() 11661 1726882390.40180: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882390.40193: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882390.40207: variable 'omit' from source: magic vars 11661 1726882390.40618: variable 'ansible_distribution_major_version' from source: facts 11661 1726882390.40634: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882390.40647: variable 'omit' from source: magic vars 11661 1726882390.40711: variable 'omit' from source: magic vars 11661 1726882390.40823: variable 'profile' from source: include params 11661 1726882390.40834: variable 'item' from source: include params 11661 1726882390.40907: variable 'item' from source: include params 11661 1726882390.40934: variable 'omit' from source: magic vars 11661 1726882390.40990: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11661 1726882390.41037: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11661 1726882390.41071: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11661 1726882390.41097: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882390.41113: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882390.41152: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11661 
1726882390.41161: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882390.41171: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882390.41283: Set connection var ansible_connection to ssh 11661 1726882390.41294: Set connection var ansible_pipelining to False 11661 1726882390.41309: Set connection var ansible_module_compression to ZIP_DEFLATED 11661 1726882390.41322: Set connection var ansible_timeout to 10 11661 1726882390.41328: Set connection var ansible_shell_type to sh 11661 1726882390.41341: Set connection var ansible_shell_executable to /bin/sh 11661 1726882390.41380: variable 'ansible_shell_executable' from source: unknown 11661 1726882390.41388: variable 'ansible_connection' from source: unknown 11661 1726882390.41395: variable 'ansible_module_compression' from source: unknown 11661 1726882390.41402: variable 'ansible_shell_type' from source: unknown 11661 1726882390.41410: variable 'ansible_shell_executable' from source: unknown 11661 1726882390.41418: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882390.41426: variable 'ansible_pipelining' from source: unknown 11661 1726882390.41432: variable 'ansible_timeout' from source: unknown 11661 1726882390.41439: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882390.41599: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11661 1726882390.41616: variable 'omit' from source: magic vars 11661 1726882390.41628: starting attempt loop 11661 1726882390.41638: running the handler 11661 1726882390.41656: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11661 1726882390.41688: _low_level_execute_command(): starting 11661 1726882390.41701: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11661 1726882390.42502: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882390.42520: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882390.42535: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882390.42563: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882390.42609: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882390.42626: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882390.42640: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882390.42673: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882390.42687: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882390.42699: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882390.42712: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882390.42730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882390.42747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882390.42765: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882390.42781: stderr chunk (state=3): >>>debug2: match found <<< 11661 
1726882390.42797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882390.42881: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882390.42902: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882390.42917: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882390.43060: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882390.44740: stdout chunk (state=3): >>>/root <<< 11661 1726882390.44931: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882390.44935: stdout chunk (state=3): >>><<< 11661 1726882390.44937: stderr chunk (state=3): >>><<< 11661 1726882390.45055: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received 
exit status from master 0 11661 1726882390.45071: _low_level_execute_command(): starting 11661 1726882390.45075: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882390.449613-12528-204158107478677 `" && echo ansible-tmp-1726882390.449613-12528-204158107478677="` echo /root/.ansible/tmp/ansible-tmp-1726882390.449613-12528-204158107478677 `" ) && sleep 0' 11661 1726882390.45713: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882390.45728: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882390.45743: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882390.45767: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882390.45810: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882390.45823: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882390.45838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882390.45860: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882390.45881: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882390.45892: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882390.45904: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882390.45916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882390.45931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882390.45944: stderr chunk (state=3): >>>debug2: checking match for 'final all' 
host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882390.45957: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882390.45973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882390.46047: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882390.46070: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882390.46085: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882390.46221: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882390.48122: stdout chunk (state=3): >>>ansible-tmp-1726882390.449613-12528-204158107478677=/root/.ansible/tmp/ansible-tmp-1726882390.449613-12528-204158107478677 <<< 11661 1726882390.48281: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882390.48340: stderr chunk (state=3): >>><<< 11661 1726882390.48343: stdout chunk (state=3): >>><<< 11661 1726882390.48658: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882390.449613-12528-204158107478677=/root/.ansible/tmp/ansible-tmp-1726882390.449613-12528-204158107478677 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882390.48662: variable 'ansible_module_compression' from source: unknown 11661 1726882390.48671: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11661k_nrvl51/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11661 1726882390.48674: variable 'ansible_facts' from source: unknown 11661 1726882390.48676: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882390.449613-12528-204158107478677/AnsiballZ_command.py 11661 1726882390.48738: Sending initial data 11661 1726882390.48741: Sent initial data (155 bytes) 11661 1726882390.50032: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882390.50079: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882390.50095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882390.50116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882390.50200: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882390.50212: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882390.50283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882390.50302: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882390.50315: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882390.50327: stderr chunk 
(state=3): >>>debug1: re-parsing configuration <<< 11661 1726882390.50340: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882390.50357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882390.50379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882390.50395: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882390.50407: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882390.50422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882390.50539: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882390.50594: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882390.50615: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882390.50823: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882390.52543: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11661 1726882390.52636: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 11661 1726882390.52739: stdout chunk (state=3): 
>>>sftp> put /root/.ansible/tmp/ansible-local-11661k_nrvl51/tmpvqvqrsfs /root/.ansible/tmp/ansible-tmp-1726882390.449613-12528-204158107478677/AnsiballZ_command.py <<< 11661 1726882390.52835: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11661 1726882390.54704: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882390.54871: stderr chunk (state=3): >>><<< 11661 1726882390.54874: stdout chunk (state=3): >>><<< 11661 1726882390.54877: done transferring module to remote 11661 1726882390.54879: _low_level_execute_command(): starting 11661 1726882390.54882: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882390.449613-12528-204158107478677/ /root/.ansible/tmp/ansible-tmp-1726882390.449613-12528-204158107478677/AnsiballZ_command.py && sleep 0' 11661 1726882390.55922: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882390.55926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882390.55963: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 11661 1726882390.55968: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882390.55970: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882390.56027: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882390.57192: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882390.57198: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882390.57309: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882390.59155: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882390.59159: stdout chunk (state=3): >>><<< 11661 1726882390.59162: stderr chunk (state=3): >>><<< 11661 1726882390.59184: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882390.59187: _low_level_execute_command(): starting 11661 1726882390.59190: 
_low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882390.449613-12528-204158107478677/AnsiballZ_command.py && sleep 0' 11661 1726882390.60360: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882390.60932: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882390.60943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882390.60958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882390.61000: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882390.61009: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882390.61019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882390.61049: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882390.61063: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882390.61071: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882390.61082: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882390.61091: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882390.61102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882390.61110: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882390.61116: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882390.61125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882390.61198: stderr 
chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882390.61216: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882390.61227: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882390.61366: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882390.77324: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0 /etc/NetworkManager/system-connections/bond0.nmconnection \nbond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection \nbond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "start": "2024-09-20 21:33:10.741888", "end": "2024-09-20 21:33:10.770708", "delta": "0:00:00.028820", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11661 1726882390.78677: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 11661 1726882390.78681: stdout chunk (state=3): >>><<< 11661 1726882390.78684: stderr chunk (state=3): >>><<< 11661 1726882390.78709: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0 /etc/NetworkManager/system-connections/bond0.nmconnection \nbond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection \nbond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "start": "2024-09-20 21:33:10.741888", "end": "2024-09-20 21:33:10.770708", "delta": "0:00:00.028820", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 11661 1726882390.78749: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882390.449613-12528-204158107478677/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11661 1726882390.78755: _low_level_execute_command(): starting 11661 1726882390.78761: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882390.449613-12528-204158107478677/ > /dev/null 2>&1 && sleep 0' 11661 1726882390.80072: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882390.80077: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882390.80181: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 11661 1726882390.80185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: 
re-parsing configuration <<< 11661 1726882390.80260: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882390.80268: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882390.80357: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882390.80484: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882390.80614: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882390.82515: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882390.82520: stdout chunk (state=3): >>><<< 11661 1726882390.82525: stderr chunk (state=3): >>><<< 11661 1726882390.82548: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882390.82555: handler run complete 11661 1726882390.82579: Evaluated conditional (False): False 11661 1726882390.82587: attempt loop complete, returning result 11661 1726882390.82590: _execute() done 11661 1726882390.82593: dumping result to json 11661 1726882390.82597: done dumping result, returning 11661 1726882390.82605: done running TaskExecutor() for managed_node2/TASK: Get NM profile info [0e448fcc-3ce9-896b-2321-0000000003b3] 11661 1726882390.82610: sending task result for task 0e448fcc-3ce9-896b-2321-0000000003b3 11661 1726882390.82710: done sending task result for task 0e448fcc-3ce9-896b-2321-0000000003b3 11661 1726882390.82713: WORKER PROCESS EXITING
ok: [managed_node2] => {
    "changed": false,
    "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc",
    "delta": "0:00:00.028820",
    "end": "2024-09-20 21:33:10.770708",
    "rc": 0,
    "start": "2024-09-20 21:33:10.741888"
}

STDOUT:

bond0 /etc/NetworkManager/system-connections/bond0.nmconnection
bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection
bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection

11661 1726882390.82836: no more pending results, returning what we have 11661 1726882390.82840: results queue empty 11661 1726882390.82841: checking for any_errors_fatal 11661 1726882390.82846: done checking for any_errors_fatal 11661 1726882390.82847: checking for max_fail_percentage 11661 1726882390.82848: done checking for max_fail_percentage 11661 1726882390.82849: checking to see if all hosts have failed and the running result is not ok 11661 1726882390.82850: done checking to see if all hosts have failed 11661 1726882390.82851: getting the remaining hosts for this loop 11661 1726882390.82852:
done getting the remaining hosts for this loop 11661 1726882390.82856: getting the next task for host managed_node2 11661 1726882390.82863: done getting next task for host managed_node2 11661 1726882390.82873: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 11661 1726882390.82877: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11661 1726882390.82881: getting variables 11661 1726882390.82882: in VariableManager get_vars() 11661 1726882390.82923: Calling all_inventory to load vars for managed_node2 11661 1726882390.82926: Calling groups_inventory to load vars for managed_node2 11661 1726882390.82928: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882390.82939: Calling all_plugins_play to load vars for managed_node2 11661 1726882390.82941: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882390.82945: Calling groups_plugins_play to load vars for managed_node2 11661 1726882390.85891: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882390.89585: done with get_vars() 11661 1726882390.89731: done getting variables 11661 1726882390.89793: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35
Friday 20 September 2024 21:33:10 -0400 (0:00:00.507) 0:00:19.612 ******

11661 1726882390.89944: entering _queue_task() for managed_node2/set_fact 11661 1726882390.90592: worker is 1 (out of 1 available) 11661 1726882390.90605: exiting _queue_task() for managed_node2/set_fact 11661 1726882390.90617: done queuing things up, now waiting for results queue to drain 11661 1726882390.90618: waiting for pending results...
11661 1726882390.91544: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 11661 1726882390.91658: in run() - task 0e448fcc-3ce9-896b-2321-0000000003b4 11661 1726882390.91682: variable 'ansible_search_path' from source: unknown 11661 1726882390.91692: variable 'ansible_search_path' from source: unknown 11661 1726882390.91733: calling self._execute() 11661 1726882390.92188: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882390.92203: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882390.92217: variable 'omit' from source: magic vars 11661 1726882390.92601: variable 'ansible_distribution_major_version' from source: facts 11661 1726882390.92623: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882390.92768: variable 'nm_profile_exists' from source: set_fact 11661 1726882390.92789: Evaluated conditional (nm_profile_exists.rc == 0): True 11661 1726882390.92800: variable 'omit' from source: magic vars 11661 1726882390.92861: variable 'omit' from source: magic vars 11661 1726882390.92900: variable 'omit' from source: magic vars 11661 1726882390.92956: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11661 1726882390.92996: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11661 1726882390.93022: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11661 1726882390.93046: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882390.93068: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882390.93104: variable 'inventory_hostname' from source: host vars for 'managed_node2' 
11661 1726882390.93113: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882390.93120: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882390.93226: Set connection var ansible_connection to ssh 11661 1726882390.93236: Set connection var ansible_pipelining to False 11661 1726882390.93245: Set connection var ansible_module_compression to ZIP_DEFLATED 11661 1726882390.93261: Set connection var ansible_timeout to 10 11661 1726882390.93272: Set connection var ansible_shell_type to sh 11661 1726882390.93284: Set connection var ansible_shell_executable to /bin/sh 11661 1726882390.93310: variable 'ansible_shell_executable' from source: unknown 11661 1726882390.93318: variable 'ansible_connection' from source: unknown 11661 1726882390.93324: variable 'ansible_module_compression' from source: unknown 11661 1726882390.93330: variable 'ansible_shell_type' from source: unknown 11661 1726882390.93336: variable 'ansible_shell_executable' from source: unknown 11661 1726882390.93343: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882390.93353: variable 'ansible_pipelining' from source: unknown 11661 1726882390.93360: variable 'ansible_timeout' from source: unknown 11661 1726882390.93371: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882390.93515: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11661 1726882390.93533: variable 'omit' from source: magic vars 11661 1726882390.93543: starting attempt loop 11661 1726882390.93549: running the handler 11661 1726882390.93571: handler run complete 11661 1726882390.93586: attempt loop complete, returning result 11661 1726882390.93597: _execute() done 
11661 1726882390.93604: dumping result to json 11661 1726882390.93611: done dumping result, returning 11661 1726882390.93622: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0e448fcc-3ce9-896b-2321-0000000003b4] 11661 1726882390.93631: sending task result for task 0e448fcc-3ce9-896b-2321-0000000003b4 11661 1726882390.93735: done sending task result for task 0e448fcc-3ce9-896b-2321-0000000003b4
ok: [managed_node2] => {
    "ansible_facts": {
        "lsr_net_profile_ansible_managed": true,
        "lsr_net_profile_exists": true,
        "lsr_net_profile_fingerprint": true
    },
    "changed": false
}
11661 1726882390.93791: no more pending results, returning what we have 11661 1726882390.93795: results queue empty 11661 1726882390.93795: checking for any_errors_fatal 11661 1726882390.93804: done checking for any_errors_fatal 11661 1726882390.93805: checking for max_fail_percentage 11661 1726882390.93806: done checking for max_fail_percentage 11661 1726882390.93807: checking to see if all hosts have failed and the running result is not ok 11661 1726882390.93808: done checking to see if all hosts have failed 11661 1726882390.93809: getting the remaining hosts for this loop 11661 1726882390.93810: done getting the remaining hosts for this loop 11661 1726882390.93813: getting the next task for host managed_node2 11661 1726882390.93822: done getting next task for host managed_node2 11661 1726882390.93824: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 11661 1726882390.93830: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state?
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11661 1726882390.93835: getting variables 11661 1726882390.93836: in VariableManager get_vars() 11661 1726882390.93878: Calling all_inventory to load vars for managed_node2 11661 1726882390.93880: Calling groups_inventory to load vars for managed_node2 11661 1726882390.93882: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882390.93893: Calling all_plugins_play to load vars for managed_node2 11661 1726882390.93896: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882390.93899: Calling groups_plugins_play to load vars for managed_node2 11661 1726882390.94417: WORKER PROCESS EXITING 11661 1726882390.96372: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882391.00026: done with get_vars() 11661 1726882391.00059: done getting variables 11661 1726882391.00122: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11661 1726882391.00483: variable 'profile' from source: include params 11661 1726882391.00487: variable 'item' from source: include params 11661 1726882391.00546: variable 'item' from source: include params

TASK [Get the ansible_managed comment in ifcfg-bond0] **************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49
Friday 20 September 2024 21:33:11 -0400 (0:00:00.107) 0:00:19.720 ******

11661 1726882391.00693: entering _queue_task() for managed_node2/command 11661 1726882391.01441: worker is 1 (out of 1 available) 11661 1726882391.01469: exiting _queue_task() for managed_node2/command 11661 1726882391.01487: done queuing things up, now waiting for results queue to drain 11661 1726882391.01489: waiting for pending results... 11661 1726882391.01804: running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-bond0 11661 1726882391.01935: in run() - task 0e448fcc-3ce9-896b-2321-0000000003b6 11661 1726882391.01960: variable 'ansible_search_path' from source: unknown 11661 1726882391.01970: variable 'ansible_search_path' from source: unknown 11661 1726882391.02010: calling self._execute() 11661 1726882391.02115: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882391.02127: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882391.02141: variable 'omit' from source: magic vars 11661 1726882391.02522: variable 'ansible_distribution_major_version' from source: facts 11661 1726882391.02540: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882391.02676: variable 'profile_stat' from source: set_fact 11661 1726882391.02699: Evaluated conditional (profile_stat.stat.exists): False 11661 1726882391.02707: when evaluation is False, skipping this task 11661 1726882391.02714: _execute() done 11661 1726882391.02722: dumping result to json 11661 1726882391.02729: done dumping result, returning 11661 1726882391.02739: done running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-bond0 [0e448fcc-3ce9-896b-2321-0000000003b6] 11661 1726882391.02752: sending task result for task 0e448fcc-3ce9-896b-2321-0000000003b6
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
11661 1726882391.02903: no more pending results, returning what we have 11661 1726882391.02908: results queue empty 11661 1726882391.02909: checking for any_errors_fatal 11661 1726882391.02918: done checking for any_errors_fatal 11661 1726882391.02918: checking for max_fail_percentage 11661 1726882391.02920: done checking for max_fail_percentage 11661 1726882391.02921: checking to see if all hosts have failed and the running result is not ok 11661 1726882391.02922: done checking to see if all hosts have failed 11661 1726882391.02923: getting the remaining hosts for this loop 11661 1726882391.02925: done getting the remaining hosts for this loop 11661 1726882391.02928: getting the next task for host managed_node2 11661 1726882391.02936: done getting next task for host managed_node2 11661 1726882391.02939: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 11661 1726882391.02944: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 11661 1726882391.02948: getting variables 11661 1726882391.02949: in VariableManager get_vars() 11661 1726882391.02997: Calling all_inventory to load vars for managed_node2 11661 1726882391.03001: Calling groups_inventory to load vars for managed_node2 11661 1726882391.03003: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882391.03016: Calling all_plugins_play to load vars for managed_node2 11661 1726882391.03019: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882391.03022: Calling groups_plugins_play to load vars for managed_node2 11661 1726882391.03696: done sending task result for task 0e448fcc-3ce9-896b-2321-0000000003b6 11661 1726882391.03699: WORKER PROCESS EXITING 11661 1726882391.04797: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882391.08307: done with get_vars() 11661 1726882391.08342: done getting variables 11661 1726882391.08417: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11661 1726882391.08682: variable 'profile' from source: include params 11661 1726882391.08686: variable 'item' from source: include params 11661 1726882391.08882: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0] *********************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:33:11 -0400 (0:00:00.082) 0:00:19.802 ****** 11661 1726882391.08915: entering _queue_task() for managed_node2/set_fact 11661 1726882391.09991: worker is 1 (out of 1 available) 11661 1726882391.10058: exiting _queue_task() for managed_node2/set_fact 11661 
1726882391.10082: done queuing things up, now waiting for results queue to drain 11661 1726882391.10084: waiting for pending results... 11661 1726882391.10776: running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-bond0 11661 1726882391.11020: in run() - task 0e448fcc-3ce9-896b-2321-0000000003b7 11661 1726882391.11042: variable 'ansible_search_path' from source: unknown 11661 1726882391.11177: variable 'ansible_search_path' from source: unknown 11661 1726882391.11220: calling self._execute() 11661 1726882391.11440: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882391.11454: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882391.11473: variable 'omit' from source: magic vars 11661 1726882391.12490: variable 'ansible_distribution_major_version' from source: facts 11661 1726882391.12510: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882391.12742: variable 'profile_stat' from source: set_fact 11661 1726882391.12771: Evaluated conditional (profile_stat.stat.exists): False 11661 1726882391.12779: when evaluation is False, skipping this task 11661 1726882391.12786: _execute() done 11661 1726882391.12799: dumping result to json 11661 1726882391.12811: done dumping result, returning 11661 1726882391.12821: done running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-bond0 [0e448fcc-3ce9-896b-2321-0000000003b7] 11661 1726882391.12830: sending task result for task 0e448fcc-3ce9-896b-2321-0000000003b7 skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11661 1726882391.12987: no more pending results, returning what we have 11661 1726882391.12992: results queue empty 11661 1726882391.12993: checking for any_errors_fatal 11661 1726882391.12999: done checking for any_errors_fatal 11661 1726882391.12999: checking 
for max_fail_percentage 11661 1726882391.13002: done checking for max_fail_percentage 11661 1726882391.13002: checking to see if all hosts have failed and the running result is not ok 11661 1726882391.13003: done checking to see if all hosts have failed 11661 1726882391.13004: getting the remaining hosts for this loop 11661 1726882391.13006: done getting the remaining hosts for this loop 11661 1726882391.13009: getting the next task for host managed_node2 11661 1726882391.13018: done getting next task for host managed_node2 11661 1726882391.13021: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 11661 1726882391.13026: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11661 1726882391.13031: getting variables 11661 1726882391.13033: in VariableManager get_vars() 11661 1726882391.13090: Calling all_inventory to load vars for managed_node2 11661 1726882391.13093: Calling groups_inventory to load vars for managed_node2 11661 1726882391.13095: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882391.13109: Calling all_plugins_play to load vars for managed_node2 11661 1726882391.13112: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882391.13115: Calling groups_plugins_play to load vars for managed_node2 11661 1726882391.14166: done sending task result for task 0e448fcc-3ce9-896b-2321-0000000003b7 11661 1726882391.14170: WORKER PROCESS EXITING 11661 1726882391.15001: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882391.16370: done with get_vars() 11661 1726882391.16392: done getting variables 11661 1726882391.16437: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11661 1726882391.16525: variable 'profile' from source: include params 11661 1726882391.16528: variable 'item' from source: include params 11661 1726882391.16572: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0] ****************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:33:11 -0400 (0:00:00.076) 0:00:19.879 ****** 11661 1726882391.16596: entering _queue_task() for managed_node2/command 11661 1726882391.16836: worker is 1 (out of 1 available) 11661 1726882391.16854: exiting _queue_task() for managed_node2/command 11661 
1726882391.16868: done queuing things up, now waiting for results queue to drain 11661 1726882391.16870: waiting for pending results... 11661 1726882391.17034: running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-bond0 11661 1726882391.17117: in run() - task 0e448fcc-3ce9-896b-2321-0000000003b8 11661 1726882391.17129: variable 'ansible_search_path' from source: unknown 11661 1726882391.17133: variable 'ansible_search_path' from source: unknown 11661 1726882391.17161: calling self._execute() 11661 1726882391.17241: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882391.17245: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882391.17255: variable 'omit' from source: magic vars 11661 1726882391.17566: variable 'ansible_distribution_major_version' from source: facts 11661 1726882391.17577: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882391.17697: variable 'profile_stat' from source: set_fact 11661 1726882391.17715: Evaluated conditional (profile_stat.stat.exists): False 11661 1726882391.17723: when evaluation is False, skipping this task 11661 1726882391.17729: _execute() done 11661 1726882391.17737: dumping result to json 11661 1726882391.17744: done dumping result, returning 11661 1726882391.17758: done running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-bond0 [0e448fcc-3ce9-896b-2321-0000000003b8] 11661 1726882391.17772: sending task result for task 0e448fcc-3ce9-896b-2321-0000000003b8 11661 1726882391.17871: done sending task result for task 0e448fcc-3ce9-896b-2321-0000000003b8 11661 1726882391.17873: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11661 1726882391.17969: no more pending results, returning what we have 11661 1726882391.17974: results queue empty 11661 
1726882391.17975: checking for any_errors_fatal 11661 1726882391.17983: done checking for any_errors_fatal 11661 1726882391.17983: checking for max_fail_percentage 11661 1726882391.17986: done checking for max_fail_percentage 11661 1726882391.17987: checking to see if all hosts have failed and the running result is not ok 11661 1726882391.17988: done checking to see if all hosts have failed 11661 1726882391.17989: getting the remaining hosts for this loop 11661 1726882391.17990: done getting the remaining hosts for this loop 11661 1726882391.17994: getting the next task for host managed_node2 11661 1726882391.18002: done getting next task for host managed_node2 11661 1726882391.18004: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 11661 1726882391.18009: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11661 1726882391.18015: getting variables 11661 1726882391.18024: in VariableManager get_vars() 11661 1726882391.18073: Calling all_inventory to load vars for managed_node2 11661 1726882391.18076: Calling groups_inventory to load vars for managed_node2 11661 1726882391.18079: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882391.18092: Calling all_plugins_play to load vars for managed_node2 11661 1726882391.18094: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882391.18097: Calling groups_plugins_play to load vars for managed_node2 11661 1726882391.19402: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882391.20345: done with get_vars() 11661 1726882391.20369: done getting variables 11661 1726882391.20417: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11661 1726882391.20505: variable 'profile' from source: include params 11661 1726882391.20507: variable 'item' from source: include params 11661 1726882391.20548: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0] *************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:33:11 -0400 (0:00:00.039) 0:00:19.919 ****** 11661 1726882391.20576: entering _queue_task() for managed_node2/set_fact 11661 1726882391.21308: worker is 1 (out of 1 available) 11661 1726882391.21315: exiting _queue_task() for managed_node2/set_fact 11661 1726882391.21326: done queuing things up, now waiting for results queue to drain 11661 1726882391.21327: waiting for pending results... 
11661 1726882391.21347: running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-bond0 11661 1726882391.21355: in run() - task 0e448fcc-3ce9-896b-2321-0000000003b9 11661 1726882391.21358: variable 'ansible_search_path' from source: unknown 11661 1726882391.21361: variable 'ansible_search_path' from source: unknown 11661 1726882391.21366: calling self._execute() 11661 1726882391.21468: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882391.21472: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882391.21484: variable 'omit' from source: magic vars 11661 1726882391.21857: variable 'ansible_distribution_major_version' from source: facts 11661 1726882391.21872: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882391.21993: variable 'profile_stat' from source: set_fact 11661 1726882391.22007: Evaluated conditional (profile_stat.stat.exists): False 11661 1726882391.22010: when evaluation is False, skipping this task 11661 1726882391.22013: _execute() done 11661 1726882391.22015: dumping result to json 11661 1726882391.22019: done dumping result, returning 11661 1726882391.22022: done running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-bond0 [0e448fcc-3ce9-896b-2321-0000000003b9] 11661 1726882391.22029: sending task result for task 0e448fcc-3ce9-896b-2321-0000000003b9 11661 1726882391.22119: done sending task result for task 0e448fcc-3ce9-896b-2321-0000000003b9 11661 1726882391.22122: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11661 1726882391.22180: no more pending results, returning what we have 11661 1726882391.22185: results queue empty 11661 1726882391.22185: checking for any_errors_fatal 11661 1726882391.22191: done checking for any_errors_fatal 11661 1726882391.22192: checking for 
max_fail_percentage 11661 1726882391.22194: done checking for max_fail_percentage 11661 1726882391.22195: checking to see if all hosts have failed and the running result is not ok 11661 1726882391.22196: done checking to see if all hosts have failed 11661 1726882391.22196: getting the remaining hosts for this loop 11661 1726882391.22198: done getting the remaining hosts for this loop 11661 1726882391.22201: getting the next task for host managed_node2 11661 1726882391.22209: done getting next task for host managed_node2 11661 1726882391.22212: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 11661 1726882391.22215: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11661 1726882391.22219: getting variables 11661 1726882391.22221: in VariableManager get_vars() 11661 1726882391.22262: Calling all_inventory to load vars for managed_node2 11661 1726882391.22267: Calling groups_inventory to load vars for managed_node2 11661 1726882391.22269: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882391.22278: Calling all_plugins_play to load vars for managed_node2 11661 1726882391.22281: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882391.22284: Calling groups_plugins_play to load vars for managed_node2 11661 1726882391.23582: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882391.24508: done with get_vars() 11661 1726882391.24525: done getting variables 11661 1726882391.24573: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11661 1726882391.24658: variable 'profile' from source: include params 11661 1726882391.24661: variable 'item' from source: include params 11661 1726882391.24704: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0'] **************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 21:33:11 -0400 (0:00:00.041) 0:00:19.960 ****** 11661 1726882391.24727: entering _queue_task() for managed_node2/assert 11661 1726882391.24956: worker is 1 (out of 1 available) 11661 1726882391.24972: exiting _queue_task() for managed_node2/assert 11661 1726882391.24985: done queuing things up, now waiting for results queue to drain 11661 1726882391.24987: waiting for pending results... 
11661 1726882391.25194: running TaskExecutor() for managed_node2/TASK: Assert that the profile is present - 'bond0' 11661 1726882391.25296: in run() - task 0e448fcc-3ce9-896b-2321-000000000260 11661 1726882391.25311: variable 'ansible_search_path' from source: unknown 11661 1726882391.25316: variable 'ansible_search_path' from source: unknown 11661 1726882391.25346: calling self._execute() 11661 1726882391.25439: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882391.25447: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882391.25458: variable 'omit' from source: magic vars 11661 1726882391.26375: variable 'ansible_distribution_major_version' from source: facts 11661 1726882391.26378: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882391.26381: variable 'omit' from source: magic vars 11661 1726882391.26384: variable 'omit' from source: magic vars 11661 1726882391.26386: variable 'profile' from source: include params 11661 1726882391.26388: variable 'item' from source: include params 11661 1726882391.26390: variable 'item' from source: include params 11661 1726882391.26393: variable 'omit' from source: magic vars 11661 1726882391.26395: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11661 1726882391.26398: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11661 1726882391.26399: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11661 1726882391.26402: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882391.26404: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882391.26406: variable 'inventory_hostname' from source: host vars for 
'managed_node2' 11661 1726882391.26408: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882391.26411: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882391.26413: Set connection var ansible_connection to ssh 11661 1726882391.26415: Set connection var ansible_pipelining to False 11661 1726882391.26417: Set connection var ansible_module_compression to ZIP_DEFLATED 11661 1726882391.26420: Set connection var ansible_timeout to 10 11661 1726882391.26422: Set connection var ansible_shell_type to sh 11661 1726882391.26424: Set connection var ansible_shell_executable to /bin/sh 11661 1726882391.26427: variable 'ansible_shell_executable' from source: unknown 11661 1726882391.26429: variable 'ansible_connection' from source: unknown 11661 1726882391.26431: variable 'ansible_module_compression' from source: unknown 11661 1726882391.26433: variable 'ansible_shell_type' from source: unknown 11661 1726882391.26435: variable 'ansible_shell_executable' from source: unknown 11661 1726882391.26437: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882391.26439: variable 'ansible_pipelining' from source: unknown 11661 1726882391.26441: variable 'ansible_timeout' from source: unknown 11661 1726882391.26443: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882391.26684: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11661 1726882391.26688: variable 'omit' from source: magic vars 11661 1726882391.26691: starting attempt loop 11661 1726882391.26693: running the handler 11661 1726882391.26793: variable 'lsr_net_profile_exists' from source: set_fact 11661 1726882391.26796: Evaluated conditional 
(lsr_net_profile_exists): True 11661 1726882391.26799: handler run complete 11661 1726882391.26803: attempt loop complete, returning result 11661 1726882391.26805: _execute() done 11661 1726882391.26807: dumping result to json 11661 1726882391.26809: done dumping result, returning 11661 1726882391.26810: done running TaskExecutor() for managed_node2/TASK: Assert that the profile is present - 'bond0' [0e448fcc-3ce9-896b-2321-000000000260] 11661 1726882391.26812: sending task result for task 0e448fcc-3ce9-896b-2321-000000000260 11661 1726882391.26871: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000260 11661 1726882391.26874: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 11661 1726882391.26926: no more pending results, returning what we have 11661 1726882391.26930: results queue empty 11661 1726882391.26930: checking for any_errors_fatal 11661 1726882391.26936: done checking for any_errors_fatal 11661 1726882391.26937: checking for max_fail_percentage 11661 1726882391.26938: done checking for max_fail_percentage 11661 1726882391.26939: checking to see if all hosts have failed and the running result is not ok 11661 1726882391.26940: done checking to see if all hosts have failed 11661 1726882391.26941: getting the remaining hosts for this loop 11661 1726882391.26942: done getting the remaining hosts for this loop 11661 1726882391.26946: getting the next task for host managed_node2 11661 1726882391.26951: done getting next task for host managed_node2 11661 1726882391.26954: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 11661 1726882391.26956: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11661 1726882391.26960: getting variables 11661 1726882391.26962: in VariableManager get_vars() 11661 1726882391.26997: Calling all_inventory to load vars for managed_node2 11661 1726882391.27000: Calling groups_inventory to load vars for managed_node2 11661 1726882391.27009: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882391.27018: Calling all_plugins_play to load vars for managed_node2 11661 1726882391.27020: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882391.27023: Calling groups_plugins_play to load vars for managed_node2 11661 1726882391.28337: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882391.29615: done with get_vars() 11661 1726882391.29635: done getting variables 11661 1726882391.29686: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11661 1726882391.29778: variable 'profile' from source: include params 11661 1726882391.29781: variable 'item' from source: include params 11661 1726882391.29820: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0'] *********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 21:33:11 -0400 
(0:00:00.051) 0:00:20.011 ****** 11661 1726882391.29852: entering _queue_task() for managed_node2/assert 11661 1726882391.30092: worker is 1 (out of 1 available) 11661 1726882391.30107: exiting _queue_task() for managed_node2/assert 11661 1726882391.30120: done queuing things up, now waiting for results queue to drain 11661 1726882391.30121: waiting for pending results... 11661 1726882391.30311: running TaskExecutor() for managed_node2/TASK: Assert that the ansible managed comment is present in 'bond0' 11661 1726882391.30387: in run() - task 0e448fcc-3ce9-896b-2321-000000000261 11661 1726882391.30400: variable 'ansible_search_path' from source: unknown 11661 1726882391.30404: variable 'ansible_search_path' from source: unknown 11661 1726882391.30431: calling self._execute() 11661 1726882391.30510: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882391.30514: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882391.30523: variable 'omit' from source: magic vars 11661 1726882391.30976: variable 'ansible_distribution_major_version' from source: facts 11661 1726882391.30994: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882391.31005: variable 'omit' from source: magic vars 11661 1726882391.31056: variable 'omit' from source: magic vars 11661 1726882391.31177: variable 'profile' from source: include params 11661 1726882391.31187: variable 'item' from source: include params 11661 1726882391.31251: variable 'item' from source: include params 11661 1726882391.31285: variable 'omit' from source: magic vars 11661 1726882391.31330: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11661 1726882391.31378: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11661 1726882391.31429: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11661 
1726882391.31468: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11661 1726882391.31525: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11661 1726882391.31583: variable 'inventory_hostname' from source: host vars for 'managed_node2'
11661 1726882391.31598: variable 'ansible_host' from source: host vars for 'managed_node2'
11661 1726882391.31606: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11661 1726882391.31730: Set connection var ansible_connection to ssh
11661 1726882391.31734: Set connection var ansible_pipelining to False
11661 1726882391.31769: Set connection var ansible_module_compression to ZIP_DEFLATED
11661 1726882391.31773: Set connection var ansible_timeout to 10
11661 1726882391.31775: Set connection var ansible_shell_type to sh
11661 1726882391.31778: Set connection var ansible_shell_executable to /bin/sh
11661 1726882391.31798: variable 'ansible_shell_executable' from source: unknown
11661 1726882391.31801: variable 'ansible_connection' from source: unknown
11661 1726882391.31805: variable 'ansible_module_compression' from source: unknown
11661 1726882391.31808: variable 'ansible_shell_type' from source: unknown
11661 1726882391.31810: variable 'ansible_shell_executable' from source: unknown
11661 1726882391.31812: variable 'ansible_host' from source: host vars for 'managed_node2'
11661 1726882391.31814: variable 'ansible_pipelining' from source: unknown
11661 1726882391.31817: variable 'ansible_timeout' from source: unknown
11661 1726882391.31825: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11661 1726882391.31922: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
11661 1726882391.31933: variable 'omit' from source: magic vars
11661 1726882391.31937: starting attempt loop
11661 1726882391.31940: running the handler
11661 1726882391.32024: variable 'lsr_net_profile_ansible_managed' from source: set_fact
11661 1726882391.32027: Evaluated conditional (lsr_net_profile_ansible_managed): True
11661 1726882391.32034: handler run complete
11661 1726882391.32047: attempt loop complete, returning result
11661 1726882391.32051: _execute() done
11661 1726882391.32053: dumping result to json
11661 1726882391.32056: done dumping result, returning
11661 1726882391.32065: done running TaskExecutor() for managed_node2/TASK: Assert that the ansible managed comment is present in 'bond0' [0e448fcc-3ce9-896b-2321-000000000261]
11661 1726882391.32073: sending task result for task 0e448fcc-3ce9-896b-2321-000000000261
11661 1726882391.32154: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000261
11661 1726882391.32157: WORKER PROCESS EXITING
ok: [managed_node2] => {
    "changed": false
}

MSG:

All assertions passed
11661 1726882391.32210: no more pending results, returning what we have
11661 1726882391.32213: results queue empty
11661 1726882391.32214: checking for any_errors_fatal
11661 1726882391.32221: done checking for any_errors_fatal
11661 1726882391.32221: checking for max_fail_percentage
11661 1726882391.32223: done checking for max_fail_percentage
11661 1726882391.32224: checking to see if all hosts have failed and the running result is not ok
11661 1726882391.32224: done checking to see if all hosts have failed
11661 1726882391.32225: getting the remaining hosts for this loop
11661 1726882391.32227: done getting the remaining hosts for this loop
11661 1726882391.32230: getting the next task for host managed_node2
11661 1726882391.32236: done getting next task for host managed_node2
11661 1726882391.32238: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }}
11661 1726882391.32241: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11661 1726882391.32245: getting variables
11661 1726882391.32247: in VariableManager get_vars()
11661 1726882391.32289: Calling all_inventory to load vars for managed_node2
11661 1726882391.32292: Calling groups_inventory to load vars for managed_node2
11661 1726882391.32295: Calling all_plugins_inventory to load vars for managed_node2
11661 1726882391.32304: Calling all_plugins_play to load vars for managed_node2
11661 1726882391.32307: Calling groups_plugins_inventory to load vars for managed_node2
11661 1726882391.32309: Calling groups_plugins_play to load vars for managed_node2
11661 1726882391.33137: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11661 1726882391.34077: done with get_vars()
11661 1726882391.34100: done getting variables
11661 1726882391.34148: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
11661 1726882391.34235: variable 'profile' from source: include params
11661 1726882391.34238: variable 'item' from source: include params
11661 1726882391.34280: variable 'item' from source: include params

TASK [Assert that the fingerprint comment is present in bond0] *****************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15
Friday 20 September 2024 21:33:11 -0400 (0:00:00.044) 0:00:20.056 ******
11661 1726882391.34308: entering _queue_task() for managed_node2/assert
11661 1726882391.34542: worker is 1 (out of 1 available)
11661 1726882391.34555: exiting _queue_task() for managed_node2/assert
11661 1726882391.34569: done queuing things up, now waiting for results queue to drain
11661 1726882391.34571: waiting for pending results...
11661 1726882391.34755: running TaskExecutor() for managed_node2/TASK: Assert that the fingerprint comment is present in bond0
11661 1726882391.34834: in run() - task 0e448fcc-3ce9-896b-2321-000000000262
11661 1726882391.34845: variable 'ansible_search_path' from source: unknown
11661 1726882391.34849: variable 'ansible_search_path' from source: unknown
11661 1726882391.34884: calling self._execute()
11661 1726882391.34960: variable 'ansible_host' from source: host vars for 'managed_node2'
11661 1726882391.34965: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11661 1726882391.34976: variable 'omit' from source: magic vars
11661 1726882391.35241: variable 'ansible_distribution_major_version' from source: facts
11661 1726882391.35251: Evaluated conditional (ansible_distribution_major_version != '6'): True
11661 1726882391.35259: variable 'omit' from source: magic vars
11661 1726882391.35287: variable 'omit' from source: magic vars
11661 1726882391.35357: variable 'profile' from source: include params
11661 1726882391.35360: variable 'item' from source: include params
11661 1726882391.35406: variable 'item' from source: include params
11661 1726882391.35422: variable 'omit' from source: magic vars
11661 1726882391.35459: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
11661 1726882391.35487: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
11661 1726882391.35504: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
11661 1726882391.35517: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11661 1726882391.35529: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11661 1726882391.35554: variable 'inventory_hostname' from source: host vars for 'managed_node2'
11661 1726882391.35557: variable 'ansible_host' from source: host vars for 'managed_node2'
11661 1726882391.35560: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11661 1726882391.35626: Set connection var ansible_connection to ssh
11661 1726882391.35631: Set connection var ansible_pipelining to False
11661 1726882391.35640: Set connection var ansible_module_compression to ZIP_DEFLATED
11661 1726882391.35643: Set connection var ansible_timeout to 10
11661 1726882391.35645: Set connection var ansible_shell_type to sh
11661 1726882391.35652: Set connection var ansible_shell_executable to /bin/sh
11661 1726882391.35673: variable 'ansible_shell_executable' from source: unknown
11661 1726882391.35677: variable 'ansible_connection' from source: unknown
11661 1726882391.35679: variable 'ansible_module_compression' from source: unknown
11661 1726882391.35681: variable 'ansible_shell_type' from source: unknown
11661 1726882391.35683: variable 'ansible_shell_executable' from source: unknown
11661 1726882391.35685: variable 'ansible_host' from source: host vars for 'managed_node2'
11661 1726882391.35690: variable 'ansible_pipelining' from source: unknown
11661 1726882391.35692: variable 'ansible_timeout' from source: unknown
11661 1726882391.35696: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11661 1726882391.35798: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
11661 1726882391.35808: variable 'omit' from source: magic vars
11661 1726882391.35813: starting attempt loop
11661 1726882391.35815: running the handler
11661 1726882391.35892: variable 'lsr_net_profile_fingerprint' from source: set_fact
11661 1726882391.35896: Evaluated conditional (lsr_net_profile_fingerprint): True
11661 1726882391.35902: handler run complete
11661 1726882391.35913: attempt loop complete, returning result
11661 1726882391.35916: _execute() done
11661 1726882391.35919: dumping result to json
11661 1726882391.35921: done dumping result, returning
11661 1726882391.35927: done running TaskExecutor() for managed_node2/TASK: Assert that the fingerprint comment is present in bond0 [0e448fcc-3ce9-896b-2321-000000000262]
11661 1726882391.35933: sending task result for task 0e448fcc-3ce9-896b-2321-000000000262
11661 1726882391.36016: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000262
11661 1726882391.36019: WORKER PROCESS EXITING
ok: [managed_node2] => {
    "changed": false
}

MSG:

All assertions passed
11661 1726882391.36073: no more pending results, returning what we have
11661 1726882391.36077: results queue empty
11661 1726882391.36078: checking for any_errors_fatal
11661 1726882391.36084: done checking for any_errors_fatal
11661 1726882391.36085: checking for max_fail_percentage
11661 1726882391.36087: done checking for max_fail_percentage
11661 1726882391.36087: checking to see if all hosts have failed and the running result is not ok
11661 1726882391.36088: done checking to see if all hosts have failed
11661 1726882391.36089: getting the remaining hosts for this loop
11661 1726882391.36090: done getting the remaining hosts for this loop
11661 1726882391.36094: getting the next task for host managed_node2
11661 1726882391.36103: done getting next task for host managed_node2
11661 1726882391.36106: ^ task is: TASK: Include the task 'get_profile_stat.yml'
11661 1726882391.36109: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11661 1726882391.36113: getting variables
11661 1726882391.36115: in VariableManager get_vars()
11661 1726882391.36154: Calling all_inventory to load vars for managed_node2
11661 1726882391.36157: Calling groups_inventory to load vars for managed_node2
11661 1726882391.36159: Calling all_plugins_inventory to load vars for managed_node2
11661 1726882391.36169: Calling all_plugins_play to load vars for managed_node2
11661 1726882391.36172: Calling groups_plugins_inventory to load vars for managed_node2
11661 1726882391.36174: Calling groups_plugins_play to load vars for managed_node2
11661 1726882391.37082: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11661 1726882391.37996: done with get_vars()
11661 1726882391.38013: done getting variables

TASK [Include the task 'get_profile_stat.yml'] *********************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3
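The two assert tasks traced above reduce to evaluating a bare conditional (`lsr_net_profile_ansible_managed`, `lsr_net_profile_fingerprint`) that an earlier `set_fact` task seeded, which is why each trace ends with `Evaluated conditional (...): True` and `All assertions passed`. The sketch below is illustrative only and is not Ansible's implementation: Ansible templates the conditional through Jinja2, while a plain dict lookup stands in for that step here.

```python
# Illustrative sketch, NOT Ansible's code: resolve bare conditionals
# against a fact dict, logging in the same shape as the trace above.
facts = {
    "lsr_net_profile_ansible_managed": True,
    "lsr_net_profile_fingerprint": True,
}

def evaluate_conditional(name: str, facts: dict) -> bool:
    """Resolve one bare conditional to a boolean (dict lookup stands in for Jinja2)."""
    result = bool(facts.get(name, False))
    print(f"Evaluated conditional ({name}): {result}")
    return result

if all(evaluate_conditional(n, facts) for n in sorted(facts)):
    print("All assertions passed")
```

A fact that is missing or falsy would make the corresponding assert fail instead of returning `ok`.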
Friday 20 September 2024 21:33:11 -0400 (0:00:00.037) 0:00:20.094 ******
11661 1726882391.38081: entering _queue_task() for managed_node2/include_tasks
11661 1726882391.38300: worker is 1 (out of 1 available)
11661 1726882391.38313: exiting _queue_task() for managed_node2/include_tasks
11661 1726882391.38326: done queuing things up, now waiting for results queue to drain
11661 1726882391.38327: waiting for pending results...
11661 1726882391.38500: running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml'
11661 1726882391.38576: in run() - task 0e448fcc-3ce9-896b-2321-000000000266
11661 1726882391.38585: variable 'ansible_search_path' from source: unknown
11661 1726882391.38588: variable 'ansible_search_path' from source: unknown
11661 1726882391.38615: calling self._execute()
11661 1726882391.38691: variable 'ansible_host' from source: host vars for 'managed_node2'
11661 1726882391.38696: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11661 1726882391.38704: variable 'omit' from source: magic vars
11661 1726882391.38974: variable 'ansible_distribution_major_version' from source: facts
11661 1726882391.38985: Evaluated conditional (ansible_distribution_major_version != '6'): True
11661 1726882391.38989: _execute() done
11661 1726882391.38995: dumping result to json
11661 1726882391.38998: done dumping result, returning
11661 1726882391.39005: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' [0e448fcc-3ce9-896b-2321-000000000266]
11661 1726882391.39007: sending task result for task 0e448fcc-3ce9-896b-2321-000000000266
11661 1726882391.39089: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000266
11661 1726882391.39093: WORKER PROCESS EXITING
11661 1726882391.39144: no more pending results, returning what we have
11661 1726882391.39149: in VariableManager get_vars()
11661 1726882391.39189: Calling all_inventory to load vars for managed_node2
11661 1726882391.39191: Calling groups_inventory to load vars for managed_node2
11661 1726882391.39193: Calling all_plugins_inventory to load vars for managed_node2
11661 1726882391.39203: Calling all_plugins_play to load vars for managed_node2
11661 1726882391.39206: Calling groups_plugins_inventory to load vars for managed_node2
11661 1726882391.39209: Calling groups_plugins_play to load vars for managed_node2
11661 1726882391.40006: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11661 1726882391.41009: done with get_vars()
11661 1726882391.41022: variable 'ansible_search_path' from source: unknown
11661 1726882391.41023: variable 'ansible_search_path' from source: unknown
11661 1726882391.41049: we have included files to process
11661 1726882391.41050: generating all_blocks data
11661 1726882391.41052: done generating all_blocks data
11661 1726882391.41056: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml
11661 1726882391.41057: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml
11661 1726882391.41059: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml
11661 1726882391.41655: done processing included file
11661 1726882391.41658: iterating over new_blocks loaded from include file
11661 1726882391.41659: in VariableManager get_vars()
11661 1726882391.41674: done with get_vars()
11661 1726882391.41675: filtering new block on tags
11661 1726882391.41691: done filtering new block on tags
11661 1726882391.41693: in VariableManager get_vars()
11661 1726882391.41704: done with get_vars()
11661 1726882391.41705: filtering new block on tags
11661 1726882391.41719: done filtering new block on tags
11661 1726882391.41720: done iterating over new_blocks loaded from include file
included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node2
11661 1726882391.41725: extending task lists for all hosts with included blocks
11661 1726882391.41826: done extending task lists
11661 1726882391.41827: done processing included files
11661 1726882391.41828: results queue empty
11661 1726882391.41828: checking for any_errors_fatal
11661 1726882391.41831: done checking for any_errors_fatal
11661 1726882391.41831: checking for max_fail_percentage
11661 1726882391.41832: done checking for max_fail_percentage
11661 1726882391.41833: checking to see if all hosts have failed and the running result is not ok
11661 1726882391.41833: done checking to see if all hosts have failed
11661 1726882391.41834: getting the remaining hosts for this loop
11661 1726882391.41835: done getting the remaining hosts for this loop
11661 1726882391.41836: getting the next task for host managed_node2
11661 1726882391.41839: done getting next task for host managed_node2
11661 1726882391.41840: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag
11661 1726882391.41843: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11661 1726882391.41844: getting variables
11661 1726882391.41845: in VariableManager get_vars()
11661 1726882391.41854: Calling all_inventory to load vars for managed_node2
11661 1726882391.41856: Calling groups_inventory to load vars for managed_node2
11661 1726882391.41857: Calling all_plugins_inventory to load vars for managed_node2
11661 1726882391.41862: Calling all_plugins_play to load vars for managed_node2
11661 1726882391.41865: Calling groups_plugins_inventory to load vars for managed_node2
11661 1726882391.41867: Calling groups_plugins_play to load vars for managed_node2
11661 1726882391.45570: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11661 1726882391.46481: done with get_vars()
11661 1726882391.46500: done getting variables
11661 1726882391.46532: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Initialize NM profile exist and ansible_managed comment flag] ************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3
Friday 20 September 2024 21:33:11 -0400 (0:00:00.084) 0:00:20.178 ******
11661 1726882391.46553: entering _queue_task() for managed_node2/set_fact
11661 1726882391.46791: worker is 1 (out of 1 available)
11661 1726882391.46803: exiting _queue_task() for managed_node2/set_fact
11661 1726882391.46814: done queuing things up, now waiting for results queue to drain
11661 1726882391.46816: waiting for pending results...
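The `set_fact` task queued here produces the result visible further down in the trace: three boolean flags, all seeded to `false`, which the later assert tasks test. A deliberately simplified Python sketch of the merge semantics (real `set_fact` also handles templating, `cacheable` facts, and precedence, none of which is modeled here; the pre-existing fact is an assumed example):

```python
# Simplified sketch of what this set_fact task amounts to: merge a small
# dict of boolean flags into the host's facts. Flag names come from the
# trace; the pre-existing fact below is an assumed example.
host_facts = {"ansible_distribution_major_version": "9"}

new_facts = {
    "lsr_net_profile_exists": False,
    "lsr_net_profile_ansible_managed": False,
    "lsr_net_profile_fingerprint": False,
}
host_facts.update(new_facts)  # set_fact overwrites/extends existing facts

result = {"ansible_facts": new_facts, "changed": False}
print(result)
```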
11661 1726882391.46998: running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag
11661 1726882391.47079: in run() - task 0e448fcc-3ce9-896b-2321-0000000003f8
11661 1726882391.47090: variable 'ansible_search_path' from source: unknown
11661 1726882391.47093: variable 'ansible_search_path' from source: unknown
11661 1726882391.47120: calling self._execute()
11661 1726882391.47199: variable 'ansible_host' from source: host vars for 'managed_node2'
11661 1726882391.47204: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11661 1726882391.47212: variable 'omit' from source: magic vars
11661 1726882391.47480: variable 'ansible_distribution_major_version' from source: facts
11661 1726882391.47490: Evaluated conditional (ansible_distribution_major_version != '6'): True
11661 1726882391.47496: variable 'omit' from source: magic vars
11661 1726882391.47525: variable 'omit' from source: magic vars
11661 1726882391.47548: variable 'omit' from source: magic vars
11661 1726882391.47583: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
11661 1726882391.47613: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
11661 1726882391.47629: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
11661 1726882391.47643: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11661 1726882391.47656: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11661 1726882391.47680: variable 'inventory_hostname' from source: host vars for 'managed_node2'
11661 1726882391.47685: variable 'ansible_host' from source: host vars for 'managed_node2'
11661 1726882391.47688: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11661 1726882391.47758: Set connection var ansible_connection to ssh
11661 1726882391.47761: Set connection var ansible_pipelining to False
11661 1726882391.47766: Set connection var ansible_module_compression to ZIP_DEFLATED
11661 1726882391.47774: Set connection var ansible_timeout to 10
11661 1726882391.47777: Set connection var ansible_shell_type to sh
11661 1726882391.47783: Set connection var ansible_shell_executable to /bin/sh
11661 1726882391.47804: variable 'ansible_shell_executable' from source: unknown
11661 1726882391.47809: variable 'ansible_connection' from source: unknown
11661 1726882391.47812: variable 'ansible_module_compression' from source: unknown
11661 1726882391.47816: variable 'ansible_shell_type' from source: unknown
11661 1726882391.47818: variable 'ansible_shell_executable' from source: unknown
11661 1726882391.47821: variable 'ansible_host' from source: host vars for 'managed_node2'
11661 1726882391.47823: variable 'ansible_pipelining' from source: unknown
11661 1726882391.47825: variable 'ansible_timeout' from source: unknown
11661 1726882391.47827: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11661 1726882391.47926: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
11661 1726882391.47936: variable 'omit' from source: magic vars
11661 1726882391.47939: starting attempt loop
11661 1726882391.47942: running the handler
11661 1726882391.47955: handler run complete
11661 1726882391.47961: attempt loop complete, returning result
11661 1726882391.47965: _execute() done
11661 1726882391.47968: dumping result to json
11661 1726882391.47970: done dumping result, returning
11661 1726882391.47977: done running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag [0e448fcc-3ce9-896b-2321-0000000003f8]
11661 1726882391.47982: sending task result for task 0e448fcc-3ce9-896b-2321-0000000003f8
11661 1726882391.48071: done sending task result for task 0e448fcc-3ce9-896b-2321-0000000003f8
11661 1726882391.48074: WORKER PROCESS EXITING
ok: [managed_node2] => {
    "ansible_facts": {
        "lsr_net_profile_ansible_managed": false,
        "lsr_net_profile_exists": false,
        "lsr_net_profile_fingerprint": false
    },
    "changed": false
}
11661 1726882391.48129: no more pending results, returning what we have
11661 1726882391.48133: results queue empty
11661 1726882391.48134: checking for any_errors_fatal
11661 1726882391.48136: done checking for any_errors_fatal
11661 1726882391.48136: checking for max_fail_percentage
11661 1726882391.48138: done checking for max_fail_percentage
11661 1726882391.48139: checking to see if all hosts have failed and the running result is not ok
11661 1726882391.48139: done checking to see if all hosts have failed
11661 1726882391.48140: getting the remaining hosts for this loop
11661 1726882391.48142: done getting the remaining hosts for this loop
11661 1726882391.48145: getting the next task for host managed_node2
11661 1726882391.48153: done getting next task for host managed_node2
11661 1726882391.48156: ^ task is: TASK: Stat profile file
11661 1726882391.48159: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11661 1726882391.48165: getting variables
11661 1726882391.48166: in VariableManager get_vars()
11661 1726882391.48200: Calling all_inventory to load vars for managed_node2
11661 1726882391.48203: Calling groups_inventory to load vars for managed_node2
11661 1726882391.48205: Calling all_plugins_inventory to load vars for managed_node2
11661 1726882391.48214: Calling all_plugins_play to load vars for managed_node2
11661 1726882391.48216: Calling groups_plugins_inventory to load vars for managed_node2
11661 1726882391.48218: Calling groups_plugins_play to load vars for managed_node2
11661 1726882391.49097: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11661 1726882391.50042: done with get_vars()
11661 1726882391.50059: done getting variables

TASK [Stat profile file] *******************************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9
Friday 20 September 2024 21:33:11 -0400 (0:00:00.035) 0:00:20.214 ******
11661 1726882391.50126: entering _queue_task() for managed_node2/stat
11661 1726882391.50348: worker is 1 (out of 1 available)
11661 1726882391.50366: exiting _queue_task() for managed_node2/stat
11661 1726882391.50377: done queuing things up, now waiting for results queue to drain
11661 1726882391.50379: waiting for pending results...
11661 1726882391.50546: running TaskExecutor() for managed_node2/TASK: Stat profile file
11661 1726882391.50614: in run() - task 0e448fcc-3ce9-896b-2321-0000000003f9
11661 1726882391.50631: variable 'ansible_search_path' from source: unknown
11661 1726882391.50635: variable 'ansible_search_path' from source: unknown
11661 1726882391.50668: calling self._execute()
11661 1726882391.50747: variable 'ansible_host' from source: host vars for 'managed_node2'
11661 1726882391.50752: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11661 1726882391.50765: variable 'omit' from source: magic vars
11661 1726882391.51038: variable 'ansible_distribution_major_version' from source: facts
11661 1726882391.51050: Evaluated conditional (ansible_distribution_major_version != '6'): True
11661 1726882391.51059: variable 'omit' from source: magic vars
11661 1726882391.51092: variable 'omit' from source: magic vars
11661 1726882391.51163: variable 'profile' from source: include params
11661 1726882391.51167: variable 'item' from source: include params
11661 1726882391.51213: variable 'item' from source: include params
11661 1726882391.51226: variable 'omit' from source: magic vars
11661 1726882391.51263: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
11661 1726882391.51294: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
11661 1726882391.51310: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
11661 1726882391.51324: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11661 1726882391.51335: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11661 1726882391.51360: variable 'inventory_hostname' from source: host vars for 'managed_node2'
11661 1726882391.51368: variable 'ansible_host' from source: host vars for 'managed_node2'
11661 1726882391.51371: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11661 1726882391.51435: Set connection var ansible_connection to ssh
11661 1726882391.51439: Set connection var ansible_pipelining to False
11661 1726882391.51445: Set connection var ansible_module_compression to ZIP_DEFLATED
11661 1726882391.51451: Set connection var ansible_timeout to 10
11661 1726882391.51457: Set connection var ansible_shell_type to sh
11661 1726882391.51465: Set connection var ansible_shell_executable to /bin/sh
11661 1726882391.51483: variable 'ansible_shell_executable' from source: unknown
11661 1726882391.51487: variable 'ansible_connection' from source: unknown
11661 1726882391.51489: variable 'ansible_module_compression' from source: unknown
11661 1726882391.51492: variable 'ansible_shell_type' from source: unknown
11661 1726882391.51495: variable 'ansible_shell_executable' from source: unknown
11661 1726882391.51498: variable 'ansible_host' from source: host vars for 'managed_node2'
11661 1726882391.51502: variable 'ansible_pipelining' from source: unknown
11661 1726882391.51504: variable 'ansible_timeout' from source: unknown
11661 1726882391.51507: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
11661 1726882391.51649: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__)
11661 1726882391.51660: variable 'omit' from source: magic vars
11661 1726882391.51666: starting attempt loop
11661 1726882391.51669: running the handler
11661 1726882391.51680: _low_level_execute_command(): starting
11661 1726882391.51688: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
11661 1726882391.52212: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
11661 1726882391.52221: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
11661 1726882391.52250: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<<
11661 1726882391.52266: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
11661 1726882391.52279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
11661 1726882391.52327: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
11661 1726882391.52333: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
11661 1726882391.52346: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
11661 1726882391.52467: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
11661 1726882391.54138: stdout chunk (state=3): >>>/root <<<
11661 1726882391.54240: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
11661 1726882391.54293: stderr chunk (state=3): >>><<<
11661 1726882391.54296: stdout chunk (state=3): >>><<<
11661 1726882391.54319: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
11661 1726882391.54329: _low_level_execute_command(): starting
11661 1726882391.54335: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882391.543161-12576-50051093405254 `" && echo ansible-tmp-1726882391.543161-12576-50051093405254="` echo /root/.ansible/tmp/ansible-tmp-1726882391.543161-12576-50051093405254 `" ) && sleep 0'
11661 1726882391.54776: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
11661 1726882391.54783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
11661 1726882391.54814: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
11661 1726882391.54835: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
11661 1726882391.54884: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
11661 1726882391.54896: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
11661 1726882391.55002: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
11661 1726882391.56880: stdout chunk (state=3): >>>ansible-tmp-1726882391.543161-12576-50051093405254=/root/.ansible/tmp/ansible-tmp-1726882391.543161-12576-50051093405254 <<<
11661 1726882391.56986: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
11661 1726882391.57037: stderr chunk (state=3): >>><<<
11661 1726882391.57041: stdout chunk (state=3): >>><<<
11661 1726882391.57057: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882391.543161-12576-50051093405254=/root/.ansible/tmp/ansible-tmp-1726882391.543161-12576-50051093405254 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests
final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882391.57105: variable 'ansible_module_compression' from source: unknown 11661 1726882391.57154: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11661k_nrvl51/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11661 1726882391.57184: variable 'ansible_facts' from source: unknown 11661 1726882391.57246: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882391.543161-12576-50051093405254/AnsiballZ_stat.py 11661 1726882391.57355: Sending initial data 11661 1726882391.57358: Sent initial data (151 bytes) 11661 1726882391.58036: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882391.58040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882391.58082: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882391.58085: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882391.58088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882391.58129: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882391.58140: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882391.58249: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882391.60000: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11661 1726882391.60093: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 11661 1726882391.60191: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11661k_nrvl51/tmpk5112at7 /root/.ansible/tmp/ansible-tmp-1726882391.543161-12576-50051093405254/AnsiballZ_stat.py <<< 11661 1726882391.60285: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11661 1726882391.61323: stderr chunk (state=3): >>>debug2: Received exit status 
from master 0 <<< 11661 1726882391.61440: stderr chunk (state=3): >>><<< 11661 1726882391.61443: stdout chunk (state=3): >>><<< 11661 1726882391.61467: done transferring module to remote 11661 1726882391.61476: _low_level_execute_command(): starting 11661 1726882391.61481: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882391.543161-12576-50051093405254/ /root/.ansible/tmp/ansible-tmp-1726882391.543161-12576-50051093405254/AnsiballZ_stat.py && sleep 0' 11661 1726882391.61943: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882391.61949: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882391.61990: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 11661 1726882391.62003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882391.62051: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882391.62067: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882391.62183: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 
1726882391.64002: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882391.64054: stderr chunk (state=3): >>><<< 11661 1726882391.64061: stdout chunk (state=3): >>><<< 11661 1726882391.64080: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882391.64084: _low_level_execute_command(): starting 11661 1726882391.64087: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882391.543161-12576-50051093405254/AnsiballZ_stat.py && sleep 0' 11661 1726882391.64541: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882391.64552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882391.64581: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 11661 1726882391.64593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882391.64641: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882391.64654: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882391.64668: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882391.64783: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882391.77899: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11661 1726882391.78898: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 11661 1726882391.78959: stderr chunk (state=3): >>><<< 11661 1726882391.78962: stdout chunk (state=3): >>><<< 11661 1726882391.78983: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
11661 1726882391.79007: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0.0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882391.543161-12576-50051093405254/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11661 1726882391.79015: _low_level_execute_command(): starting 11661 1726882391.79019: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882391.543161-12576-50051093405254/ > /dev/null 2>&1 && sleep 0' 11661 1726882391.79480: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882391.79486: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882391.79496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882391.79525: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 11661 1726882391.79537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 11661 1726882391.79546: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882391.79595: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882391.79607: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882391.79719: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882391.81545: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882391.81599: stderr chunk (state=3): >>><<< 11661 1726882391.81604: stdout chunk (state=3): >>><<< 11661 1726882391.81618: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882391.81625: handler run complete 11661 1726882391.81642: attempt loop complete, returning result 11661 1726882391.81645: _execute() done 11661 1726882391.81648: dumping result to json 11661 1726882391.81650: done dumping result, returning 11661 1726882391.81661: done running TaskExecutor() for managed_node2/TASK: Stat profile file [0e448fcc-3ce9-896b-2321-0000000003f9] 11661 1726882391.81666: sending task result for task 0e448fcc-3ce9-896b-2321-0000000003f9 11661 1726882391.81759: done sending task result for task 0e448fcc-3ce9-896b-2321-0000000003f9 11661 1726882391.81762: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 11661 1726882391.81817: no more pending results, returning what we have 11661 1726882391.81820: results queue empty 11661 1726882391.81821: checking for any_errors_fatal 11661 1726882391.81828: done checking for any_errors_fatal 11661 1726882391.81829: checking for max_fail_percentage 11661 1726882391.81830: done checking for max_fail_percentage 11661 1726882391.81831: checking to see if all hosts have failed and the running result is not ok 11661 1726882391.81832: done checking to see if all hosts have failed 11661 1726882391.81832: getting the remaining hosts for this loop 11661 1726882391.81834: done getting the remaining hosts for this loop 11661 1726882391.81837: getting the next task for host managed_node2 11661 1726882391.81844: done getting next task for host managed_node2 11661 1726882391.81846: ^ task is: TASK: Set NM profile exist flag based on the profile files 11661 1726882391.81849: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11661 1726882391.81854: getting variables 11661 1726882391.81856: in VariableManager get_vars() 11661 1726882391.81900: Calling all_inventory to load vars for managed_node2 11661 1726882391.81903: Calling groups_inventory to load vars for managed_node2 11661 1726882391.81905: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882391.81916: Calling all_plugins_play to load vars for managed_node2 11661 1726882391.81918: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882391.81921: Calling groups_plugins_play to load vars for managed_node2 11661 1726882391.82744: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882391.83728: done with get_vars() 11661 1726882391.83742: done getting variables 11661 1726882391.83793: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:33:11 -0400 (0:00:00.336) 0:00:20.551 ****** 11661 1726882391.83815: entering _queue_task() for managed_node2/set_fact 11661 1726882391.84025: worker is 1 (out of 1 available) 11661 1726882391.84039: exiting _queue_task() for managed_node2/set_fact 11661 1726882391.84053: done queuing things up, now waiting for results queue to drain 11661 1726882391.84055: waiting for pending results... 11661 1726882391.84227: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files 11661 1726882391.84302: in run() - task 0e448fcc-3ce9-896b-2321-0000000003fa 11661 1726882391.84312: variable 'ansible_search_path' from source: unknown 11661 1726882391.84316: variable 'ansible_search_path' from source: unknown 11661 1726882391.84345: calling self._execute() 11661 1726882391.84418: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882391.84422: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882391.84431: variable 'omit' from source: magic vars 11661 1726882391.84707: variable 'ansible_distribution_major_version' from source: facts 11661 1726882391.84717: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882391.84801: variable 'profile_stat' from source: set_fact 11661 1726882391.84814: Evaluated conditional (profile_stat.stat.exists): False 11661 1726882391.84818: when evaluation is False, skipping this task 11661 1726882391.84822: _execute() done 11661 1726882391.84825: dumping result to json 11661 1726882391.84827: done dumping result, returning 11661 1726882391.84830: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files [0e448fcc-3ce9-896b-2321-0000000003fa] 11661 1726882391.84834: sending task result for task 
0e448fcc-3ce9-896b-2321-0000000003fa 11661 1726882391.84915: done sending task result for task 0e448fcc-3ce9-896b-2321-0000000003fa 11661 1726882391.84919: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11661 1726882391.84990: no more pending results, returning what we have 11661 1726882391.84993: results queue empty 11661 1726882391.84993: checking for any_errors_fatal 11661 1726882391.84999: done checking for any_errors_fatal 11661 1726882391.85000: checking for max_fail_percentage 11661 1726882391.85001: done checking for max_fail_percentage 11661 1726882391.85002: checking to see if all hosts have failed and the running result is not ok 11661 1726882391.85003: done checking to see if all hosts have failed 11661 1726882391.85003: getting the remaining hosts for this loop 11661 1726882391.85004: done getting the remaining hosts for this loop 11661 1726882391.85007: getting the next task for host managed_node2 11661 1726882391.85012: done getting next task for host managed_node2 11661 1726882391.85015: ^ task is: TASK: Get NM profile info 11661 1726882391.85018: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 11661 1726882391.85022: getting variables 11661 1726882391.85023: in VariableManager get_vars() 11661 1726882391.85064: Calling all_inventory to load vars for managed_node2 11661 1726882391.85067: Calling groups_inventory to load vars for managed_node2 11661 1726882391.85068: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882391.85075: Calling all_plugins_play to load vars for managed_node2 11661 1726882391.85077: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882391.85079: Calling groups_plugins_play to load vars for managed_node2 11661 1726882391.85999: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882391.86940: done with get_vars() 11661 1726882391.86958: done getting variables 11661 1726882391.87003: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:33:11 -0400 (0:00:00.032) 0:00:20.583 ****** 11661 1726882391.87024: entering _queue_task() for managed_node2/shell 11661 1726882391.87230: worker is 1 (out of 1 available) 11661 1726882391.87243: exiting _queue_task() for managed_node2/shell 11661 1726882391.87257: done queuing things up, now waiting for results queue to drain 11661 1726882391.87259: waiting for pending results... 
11661 1726882391.87429: running TaskExecutor() for managed_node2/TASK: Get NM profile info 11661 1726882391.87531: in run() - task 0e448fcc-3ce9-896b-2321-0000000003fb 11661 1726882391.87536: variable 'ansible_search_path' from source: unknown 11661 1726882391.87541: variable 'ansible_search_path' from source: unknown 11661 1726882391.87570: calling self._execute() 11661 1726882391.87642: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882391.87646: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882391.87660: variable 'omit' from source: magic vars 11661 1726882391.87950: variable 'ansible_distribution_major_version' from source: facts 11661 1726882391.87962: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882391.87969: variable 'omit' from source: magic vars 11661 1726882391.88007: variable 'omit' from source: magic vars 11661 1726882391.88081: variable 'profile' from source: include params 11661 1726882391.88084: variable 'item' from source: include params 11661 1726882391.88132: variable 'item' from source: include params 11661 1726882391.88145: variable 'omit' from source: magic vars 11661 1726882391.88184: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11661 1726882391.88213: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11661 1726882391.88229: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11661 1726882391.88242: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882391.88252: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882391.88285: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11661 
1726882391.88288: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882391.88291: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882391.88365: Set connection var ansible_connection to ssh 11661 1726882391.88368: Set connection var ansible_pipelining to False 11661 1726882391.88375: Set connection var ansible_module_compression to ZIP_DEFLATED 11661 1726882391.88383: Set connection var ansible_timeout to 10 11661 1726882391.88386: Set connection var ansible_shell_type to sh 11661 1726882391.88392: Set connection var ansible_shell_executable to /bin/sh 11661 1726882391.88408: variable 'ansible_shell_executable' from source: unknown 11661 1726882391.88411: variable 'ansible_connection' from source: unknown 11661 1726882391.88414: variable 'ansible_module_compression' from source: unknown 11661 1726882391.88416: variable 'ansible_shell_type' from source: unknown 11661 1726882391.88420: variable 'ansible_shell_executable' from source: unknown 11661 1726882391.88425: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882391.88429: variable 'ansible_pipelining' from source: unknown 11661 1726882391.88431: variable 'ansible_timeout' from source: unknown 11661 1726882391.88435: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882391.88536: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11661 1726882391.88542: variable 'omit' from source: magic vars 11661 1726882391.88547: starting attempt loop 11661 1726882391.88553: running the handler 11661 1726882391.88559: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11661 1726882391.88577: _low_level_execute_command(): starting 11661 1726882391.88583: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11661 1726882391.89114: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882391.89124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882391.89157: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 11661 1726882391.89174: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882391.89225: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882391.89231: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882391.89241: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882391.89382: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882391.91016: stdout chunk (state=3): >>>/root <<< 11661 1726882391.91166: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 <<< 11661 1726882391.91193: stderr chunk (state=3): >>><<< 11661 1726882391.91199: stdout chunk (state=3): >>><<< 11661 1726882391.91222: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882391.91238: _low_level_execute_command(): starting 11661 1726882391.91250: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882391.912277-12585-111306736822588 `" && echo ansible-tmp-1726882391.912277-12585-111306736822588="` echo /root/.ansible/tmp/ansible-tmp-1726882391.912277-12585-111306736822588 `" ) && sleep 0' 11661 1726882391.91726: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882391.91748: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882391.91761: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882391.91777: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 11661 1726882391.91795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882391.91834: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882391.91840: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882391.91854: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882391.91990: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882391.93969: stdout chunk (state=3): >>>ansible-tmp-1726882391.912277-12585-111306736822588=/root/.ansible/tmp/ansible-tmp-1726882391.912277-12585-111306736822588 <<< 11661 1726882391.94257: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882391.94276: stdout chunk (state=3): >>><<< 11661 1726882391.94288: stderr chunk (state=3): >>><<< 11661 1726882391.94311: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882391.912277-12585-111306736822588=/root/.ansible/tmp/ansible-tmp-1726882391.912277-12585-111306736822588 , 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882391.94349: variable 'ansible_module_compression' from source: unknown 11661 1726882391.94420: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11661k_nrvl51/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11661 1726882391.94466: variable 'ansible_facts' from source: unknown 11661 1726882391.94570: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882391.912277-12585-111306736822588/AnsiballZ_command.py 11661 1726882391.94748: Sending initial data 11661 1726882391.94755: Sent initial data (155 bytes) 11661 1726882391.95831: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882391.95849: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882391.95873: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 11661 1726882391.95893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882391.95947: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882391.95967: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882391.95981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882391.96000: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882391.96017: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882391.96036: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882391.96053: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882391.96071: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882391.96088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882391.96102: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882391.96115: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882391.96134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882391.96217: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882391.96247: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882391.96274: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882391.96460: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882391.98225: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server 
supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11661 1726882391.98313: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 11661 1726882391.98411: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11661k_nrvl51/tmpp1doqppo /root/.ansible/tmp/ansible-tmp-1726882391.912277-12585-111306736822588/AnsiballZ_command.py <<< 11661 1726882391.98503: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11661 1726882391.99959: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882392.00100: stderr chunk (state=3): >>><<< 11661 1726882392.00104: stdout chunk (state=3): >>><<< 11661 1726882392.00106: done transferring module to remote 11661 1726882392.00109: _low_level_execute_command(): starting 11661 1726882392.00111: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882391.912277-12585-111306736822588/ /root/.ansible/tmp/ansible-tmp-1726882391.912277-12585-111306736822588/AnsiballZ_command.py && sleep 0' 11661 1726882392.00778: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882392.00792: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882392.00808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882392.00827: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882392.00888: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882392.00901: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882392.00915: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882392.00934: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882392.00946: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882392.00963: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882392.00985: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882392.01001: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882392.01017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882392.01030: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882392.01042: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882392.01061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882392.01145: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882392.01167: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882392.01183: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882392.01548: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882392.03135: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882392.03233: stderr chunk (state=3): >>><<< 11661 
1726882392.03243: stdout chunk (state=3): >>><<< 11661 1726882392.03350: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882392.03354: _low_level_execute_command(): starting 11661 1726882392.03357: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882391.912277-12585-111306736822588/AnsiballZ_command.py && sleep 0' 11661 1726882392.03947: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882392.03963: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882392.03982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882392.04001: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882392.04052: 
stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882392.04069: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882392.04085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882392.04103: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882392.04115: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882392.04134: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882392.04146: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882392.04160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882392.04180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882392.04192: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882392.04203: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882392.04217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882392.04304: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882392.04327: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882392.04349: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882392.04501: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882392.19934: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "start": 
"2024-09-20 21:33:12.174465", "end": "2024-09-20 21:33:12.197013", "delta": "0:00:00.022548", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11661 1726882392.21188: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 11661 1726882392.21257: stderr chunk (state=3): >>><<< 11661 1726882392.21260: stdout chunk (state=3): >>><<< 11661 1726882392.21288: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "start": "2024-09-20 21:33:12.174465", "end": "2024-09-20 21:33:12.197013", "delta": "0:00:00.022548", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 11661 1726882392.21321: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882391.912277-12585-111306736822588/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11661 1726882392.21336: _low_level_execute_command(): starting 11661 1726882392.21339: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882391.912277-12585-111306736822588/ > /dev/null 2>&1 && sleep 0' 11661 1726882392.22017: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882392.22020: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882392.22023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 
11661 1726882392.22025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882392.22028: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882392.22030: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882392.22032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882392.22034: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882392.22036: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882392.22038: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882392.22040: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882392.22047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882392.22204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882392.22207: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882392.22210: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882392.22215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882392.22217: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882392.22219: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882392.22221: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882392.22319: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882392.24117: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 
1726882392.24158: stderr chunk (state=3): >>><<< 11661 1726882392.24161: stdout chunk (state=3): >>><<< 11661 1726882392.24177: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882392.24184: handler run complete 11661 1726882392.24204: Evaluated conditional (False): False 11661 1726882392.24212: attempt loop complete, returning result 11661 1726882392.24215: _execute() done 11661 1726882392.24217: dumping result to json 11661 1726882392.24222: done dumping result, returning 11661 1726882392.24229: done running TaskExecutor() for managed_node2/TASK: Get NM profile info [0e448fcc-3ce9-896b-2321-0000000003fb] 11661 1726882392.24234: sending task result for task 0e448fcc-3ce9-896b-2321-0000000003fb 11661 1726882392.24329: done sending task result for task 0e448fcc-3ce9-896b-2321-0000000003fb 11661 1726882392.24332: WORKER PROCESS 
EXITING ok: [managed_node2] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "delta": "0:00:00.022548", "end": "2024-09-20 21:33:12.197013", "rc": 0, "start": "2024-09-20 21:33:12.174465" } STDOUT: bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection 11661 1726882392.24402: no more pending results, returning what we have 11661 1726882392.24405: results queue empty 11661 1726882392.24406: checking for any_errors_fatal 11661 1726882392.24412: done checking for any_errors_fatal 11661 1726882392.24413: checking for max_fail_percentage 11661 1726882392.24414: done checking for max_fail_percentage 11661 1726882392.24415: checking to see if all hosts have failed and the running result is not ok 11661 1726882392.24416: done checking to see if all hosts have failed 11661 1726882392.24416: getting the remaining hosts for this loop 11661 1726882392.24418: done getting the remaining hosts for this loop 11661 1726882392.24421: getting the next task for host managed_node2 11661 1726882392.24429: done getting next task for host managed_node2 11661 1726882392.24431: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 11661 1726882392.24435: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11661 1726882392.24438: getting variables 11661 1726882392.24440: in VariableManager get_vars() 11661 1726882392.24481: Calling all_inventory to load vars for managed_node2 11661 1726882392.24483: Calling groups_inventory to load vars for managed_node2 11661 1726882392.24485: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882392.24495: Calling all_plugins_play to load vars for managed_node2 11661 1726882392.24497: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882392.24500: Calling groups_plugins_play to load vars for managed_node2 11661 1726882392.25513: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882392.26945: done with get_vars() 11661 1726882392.26967: done getting variables 11661 1726882392.27011: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:33:12 -0400 (0:00:00.400) 0:00:20.983 ****** 11661 1726882392.27032: entering _queue_task() for managed_node2/set_fact 11661 1726882392.27263: worker is 1 (out of 1 available) 11661 1726882392.27277: exiting _queue_task() for managed_node2/set_fact 11661 1726882392.27291: done queuing things up, now waiting for results queue to drain 11661 1726882392.27293: waiting for pending results... 
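[editor's note] The "Get NM profile info" task above shells out to `nmcli -f NAME,FILENAME connection show | grep bond0.0 | grep /etc` and the following `set_fact` task turns its return code into `lsr_net_profile_exists`. A minimal sketch of the same existence check in Python — a hypothetical helper that parses already-captured `nmcli` stdout instead of invoking `nmcli`, and matches the profile NAME exactly rather than as a `grep` substring:

```python
def nm_profile_in_etc(nmcli_output: str, profile: str) -> bool:
    """Return True if `profile` appears with a backing file under /etc.

    `nmcli_output` is assumed to be the raw stdout of
    `nmcli -f NAME,FILENAME connection show`. Note this is stricter than
    the log's grep pipeline: it requires an exact NAME match, whereas
    `grep bond0.0` would also match e.g. "bond0.0.backup".
    """
    for line in nmcli_output.splitlines():
        fields = line.split()
        # Expect at least NAME and FILENAME columns; skip the header row
        # and any connection whose file is not under /etc (e.g. /run).
        if len(fields) >= 2 and fields[0] == profile and fields[1].startswith("/etc"):
            return True
    return False
```

Applied to the stdout in the log above ("bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection"), this returns True, matching the rc=0 result the playbook observed.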
11661 1726882392.27469: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 11661 1726882392.27542: in run() - task 0e448fcc-3ce9-896b-2321-0000000003fc 11661 1726882392.27555: variable 'ansible_search_path' from source: unknown 11661 1726882392.27559: variable 'ansible_search_path' from source: unknown 11661 1726882392.27589: calling self._execute() 11661 1726882392.27666: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882392.27670: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882392.27680: variable 'omit' from source: magic vars 11661 1726882392.27948: variable 'ansible_distribution_major_version' from source: facts 11661 1726882392.27959: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882392.28048: variable 'nm_profile_exists' from source: set_fact 11661 1726882392.28061: Evaluated conditional (nm_profile_exists.rc == 0): True 11661 1726882392.28065: variable 'omit' from source: magic vars 11661 1726882392.28098: variable 'omit' from source: magic vars 11661 1726882392.28119: variable 'omit' from source: magic vars 11661 1726882392.28157: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11661 1726882392.28184: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11661 1726882392.28206: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11661 1726882392.28219: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882392.28229: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882392.28256: variable 'inventory_hostname' from source: host vars for 'managed_node2' 
11661 1726882392.28259: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882392.28262: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882392.28328: Set connection var ansible_connection to ssh 11661 1726882392.28331: Set connection var ansible_pipelining to False 11661 1726882392.28337: Set connection var ansible_module_compression to ZIP_DEFLATED 11661 1726882392.28347: Set connection var ansible_timeout to 10 11661 1726882392.28353: Set connection var ansible_shell_type to sh 11661 1726882392.28356: Set connection var ansible_shell_executable to /bin/sh 11661 1726882392.28374: variable 'ansible_shell_executable' from source: unknown 11661 1726882392.28377: variable 'ansible_connection' from source: unknown 11661 1726882392.28379: variable 'ansible_module_compression' from source: unknown 11661 1726882392.28381: variable 'ansible_shell_type' from source: unknown 11661 1726882392.28384: variable 'ansible_shell_executable' from source: unknown 11661 1726882392.28386: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882392.28388: variable 'ansible_pipelining' from source: unknown 11661 1726882392.28391: variable 'ansible_timeout' from source: unknown 11661 1726882392.28394: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882392.28493: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11661 1726882392.28500: variable 'omit' from source: magic vars 11661 1726882392.28505: starting attempt loop 11661 1726882392.28507: running the handler 11661 1726882392.28519: handler run complete 11661 1726882392.28527: attempt loop complete, returning result 11661 1726882392.28530: _execute() done 
11661 1726882392.28532: dumping result to json 11661 1726882392.28534: done dumping result, returning 11661 1726882392.28541: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0e448fcc-3ce9-896b-2321-0000000003fc] 11661 1726882392.28547: sending task result for task 0e448fcc-3ce9-896b-2321-0000000003fc 11661 1726882392.28629: done sending task result for task 0e448fcc-3ce9-896b-2321-0000000003fc 11661 1726882392.28632: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 11661 1726882392.28693: no more pending results, returning what we have 11661 1726882392.28697: results queue empty 11661 1726882392.28697: checking for any_errors_fatal 11661 1726882392.28705: done checking for any_errors_fatal 11661 1726882392.28706: checking for max_fail_percentage 11661 1726882392.28707: done checking for max_fail_percentage 11661 1726882392.28708: checking to see if all hosts have failed and the running result is not ok 11661 1726882392.28709: done checking to see if all hosts have failed 11661 1726882392.28709: getting the remaining hosts for this loop 11661 1726882392.28711: done getting the remaining hosts for this loop 11661 1726882392.28714: getting the next task for host managed_node2 11661 1726882392.28722: done getting next task for host managed_node2 11661 1726882392.28724: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 11661 1726882392.28728: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11661 1726882392.28731: getting variables 11661 1726882392.28732: in VariableManager get_vars() 11661 1726882392.28773: Calling all_inventory to load vars for managed_node2 11661 1726882392.28776: Calling groups_inventory to load vars for managed_node2 11661 1726882392.28778: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882392.28792: Calling all_plugins_play to load vars for managed_node2 11661 1726882392.28798: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882392.28801: Calling groups_plugins_play to load vars for managed_node2 11661 1726882392.29603: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882392.30550: done with get_vars() 11661 1726882392.30569: done getting variables 11661 1726882392.30611: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11661 1726882392.30698: variable 'profile' from source: include params 11661 1726882392.30701: variable 'item' from source: include params 11661 1726882392.30741: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0.0] ************************ task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:33:12 -0400 (0:00:00.037) 0:00:21.021 ****** 11661 1726882392.30776: entering _queue_task() for managed_node2/command 11661 1726882392.31000: worker is 1 (out of 1 available) 11661 1726882392.31015: exiting _queue_task() for managed_node2/command 11661 1726882392.31026: done queuing things up, now waiting for results queue to drain 11661 1726882392.31028: waiting for pending results... 11661 1726882392.31206: running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-bond0.0 11661 1726882392.31282: in run() - task 0e448fcc-3ce9-896b-2321-0000000003fe 11661 1726882392.31296: variable 'ansible_search_path' from source: unknown 11661 1726882392.31299: variable 'ansible_search_path' from source: unknown 11661 1726882392.31327: calling self._execute() 11661 1726882392.31408: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882392.31412: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882392.31421: variable 'omit' from source: magic vars 11661 1726882392.31692: variable 'ansible_distribution_major_version' from source: facts 11661 1726882392.31702: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882392.31788: variable 'profile_stat' from source: set_fact 11661 1726882392.31799: Evaluated conditional (profile_stat.stat.exists): False 11661 1726882392.31803: when evaluation is False, skipping this task 11661 1726882392.31805: _execute() done 11661 1726882392.31808: dumping result to json 11661 1726882392.31810: done dumping result, returning 11661 1726882392.31818: done running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-bond0.0 [0e448fcc-3ce9-896b-2321-0000000003fe] 11661 1726882392.31823: sending task result for task 0e448fcc-3ce9-896b-2321-0000000003fe 11661 
1726882392.31907: done sending task result for task 0e448fcc-3ce9-896b-2321-0000000003fe 11661 1726882392.31910: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11661 1726882392.31966: no more pending results, returning what we have 11661 1726882392.31970: results queue empty 11661 1726882392.31970: checking for any_errors_fatal 11661 1726882392.31976: done checking for any_errors_fatal 11661 1726882392.31977: checking for max_fail_percentage 11661 1726882392.31979: done checking for max_fail_percentage 11661 1726882392.31979: checking to see if all hosts have failed and the running result is not ok 11661 1726882392.31980: done checking to see if all hosts have failed 11661 1726882392.31981: getting the remaining hosts for this loop 11661 1726882392.31982: done getting the remaining hosts for this loop 11661 1726882392.31985: getting the next task for host managed_node2 11661 1726882392.31992: done getting next task for host managed_node2 11661 1726882392.31995: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 11661 1726882392.31998: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 11661 1726882392.32002: getting variables 11661 1726882392.32003: in VariableManager get_vars() 11661 1726882392.32042: Calling all_inventory to load vars for managed_node2 11661 1726882392.32049: Calling groups_inventory to load vars for managed_node2 11661 1726882392.32051: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882392.32061: Calling all_plugins_play to load vars for managed_node2 11661 1726882392.32065: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882392.32069: Calling groups_plugins_play to load vars for managed_node2 11661 1726882392.32998: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882392.33922: done with get_vars() 11661 1726882392.33938: done getting variables 11661 1726882392.33983: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11661 1726882392.34066: variable 'profile' from source: include params 11661 1726882392.34068: variable 'item' from source: include params 11661 1726882392.34110: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0.0] ********************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:33:12 -0400 (0:00:00.033) 0:00:21.054 ****** 11661 1726882392.34132: entering _queue_task() for managed_node2/set_fact 11661 1726882392.34358: worker is 1 (out of 1 available) 11661 1726882392.34374: exiting _queue_task() for managed_node2/set_fact 11661 1726882392.34387: done queuing things up, now waiting for results queue 
to drain 11661 1726882392.34388: waiting for pending results... 11661 1726882392.34561: running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-bond0.0 11661 1726882392.34637: in run() - task 0e448fcc-3ce9-896b-2321-0000000003ff 11661 1726882392.34651: variable 'ansible_search_path' from source: unknown 11661 1726882392.34657: variable 'ansible_search_path' from source: unknown 11661 1726882392.34685: calling self._execute() 11661 1726882392.34761: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882392.34766: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882392.34776: variable 'omit' from source: magic vars 11661 1726882392.35031: variable 'ansible_distribution_major_version' from source: facts 11661 1726882392.35040: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882392.35128: variable 'profile_stat' from source: set_fact 11661 1726882392.35138: Evaluated conditional (profile_stat.stat.exists): False 11661 1726882392.35141: when evaluation is False, skipping this task 11661 1726882392.35144: _execute() done 11661 1726882392.35146: dumping result to json 11661 1726882392.35149: done dumping result, returning 11661 1726882392.35160: done running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-bond0.0 [0e448fcc-3ce9-896b-2321-0000000003ff] 11661 1726882392.35165: sending task result for task 0e448fcc-3ce9-896b-2321-0000000003ff 11661 1726882392.35248: done sending task result for task 0e448fcc-3ce9-896b-2321-0000000003ff 11661 1726882392.35250: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11661 1726882392.35309: no more pending results, returning what we have 11661 1726882392.35313: results queue empty 11661 1726882392.35314: checking for any_errors_fatal 11661 
1726882392.35323: done checking for any_errors_fatal 11661 1726882392.35324: checking for max_fail_percentage 11661 1726882392.35325: done checking for max_fail_percentage 11661 1726882392.35326: checking to see if all hosts have failed and the running result is not ok 11661 1726882392.35327: done checking to see if all hosts have failed 11661 1726882392.35327: getting the remaining hosts for this loop 11661 1726882392.35329: done getting the remaining hosts for this loop 11661 1726882392.35332: getting the next task for host managed_node2 11661 1726882392.35338: done getting next task for host managed_node2 11661 1726882392.35341: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 11661 1726882392.35344: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11661 1726882392.35348: getting variables 11661 1726882392.35349: in VariableManager get_vars() 11661 1726882392.35382: Calling all_inventory to load vars for managed_node2 11661 1726882392.35385: Calling groups_inventory to load vars for managed_node2 11661 1726882392.35387: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882392.35396: Calling all_plugins_play to load vars for managed_node2 11661 1726882392.35404: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882392.35408: Calling groups_plugins_play to load vars for managed_node2 11661 1726882392.36187: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882392.37219: done with get_vars() 11661 1726882392.37234: done getting variables 11661 1726882392.37282: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11661 1726882392.37360: variable 'profile' from source: include params 11661 1726882392.37363: variable 'item' from source: include params 11661 1726882392.37403: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0.0] **************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:33:12 -0400 (0:00:00.032) 0:00:21.087 ****** 11661 1726882392.37425: entering _queue_task() for managed_node2/command 11661 1726882392.37648: worker is 1 (out of 1 available) 11661 1726882392.37665: exiting _queue_task() for managed_node2/command 11661 1726882392.37678: done queuing things up, now waiting for results queue to drain 11661 1726882392.37679: waiting for pending results... 
11661 1726882392.37854: running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-bond0.0 11661 1726882392.37930: in run() - task 0e448fcc-3ce9-896b-2321-000000000400 11661 1726882392.37942: variable 'ansible_search_path' from source: unknown 11661 1726882392.37945: variable 'ansible_search_path' from source: unknown 11661 1726882392.37977: calling self._execute() 11661 1726882392.38051: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882392.38059: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882392.38070: variable 'omit' from source: magic vars 11661 1726882392.38334: variable 'ansible_distribution_major_version' from source: facts 11661 1726882392.38353: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882392.38433: variable 'profile_stat' from source: set_fact 11661 1726882392.38453: Evaluated conditional (profile_stat.stat.exists): False 11661 1726882392.38457: when evaluation is False, skipping this task 11661 1726882392.38460: _execute() done 11661 1726882392.38462: dumping result to json 11661 1726882392.38467: done dumping result, returning 11661 1726882392.38469: done running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-bond0.0 [0e448fcc-3ce9-896b-2321-000000000400] 11661 1726882392.38474: sending task result for task 0e448fcc-3ce9-896b-2321-000000000400 11661 1726882392.38556: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000400 11661 1726882392.38559: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11661 1726882392.38631: no more pending results, returning what we have 11661 1726882392.38635: results queue empty 11661 1726882392.38636: checking for any_errors_fatal 11661 1726882392.38641: done checking for any_errors_fatal 11661 1726882392.38642: checking for 
max_fail_percentage 11661 1726882392.38644: done checking for max_fail_percentage 11661 1726882392.38644: checking to see if all hosts have failed and the running result is not ok 11661 1726882392.38645: done checking to see if all hosts have failed 11661 1726882392.38646: getting the remaining hosts for this loop 11661 1726882392.38647: done getting the remaining hosts for this loop 11661 1726882392.38650: getting the next task for host managed_node2 11661 1726882392.38656: done getting next task for host managed_node2 11661 1726882392.38658: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 11661 1726882392.38667: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11661 1726882392.38671: getting variables 11661 1726882392.38672: in VariableManager get_vars() 11661 1726882392.38705: Calling all_inventory to load vars for managed_node2 11661 1726882392.38707: Calling groups_inventory to load vars for managed_node2 11661 1726882392.38710: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882392.38719: Calling all_plugins_play to load vars for managed_node2 11661 1726882392.38722: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882392.38724: Calling groups_plugins_play to load vars for managed_node2 11661 1726882392.39546: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882392.40653: done with get_vars() 11661 1726882392.40680: done getting variables 11661 1726882392.40741: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11661 1726882392.40857: variable 'profile' from source: include params 11661 1726882392.40861: variable 'item' from source: include params 11661 1726882392.40920: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0.0] ************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:33:12 -0400 (0:00:00.035) 0:00:21.122 ****** 11661 1726882392.40951: entering _queue_task() for managed_node2/set_fact 11661 1726882392.41284: worker is 1 (out of 1 available) 11661 1726882392.41296: exiting _queue_task() for managed_node2/set_fact 11661 1726882392.41308: done queuing things up, now waiting for results queue to drain 11661 1726882392.41310: waiting for pending results... 
11661 1726882392.41678: running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-bond0.0 11661 1726882392.41777: in run() - task 0e448fcc-3ce9-896b-2321-000000000401 11661 1726882392.41787: variable 'ansible_search_path' from source: unknown 11661 1726882392.41791: variable 'ansible_search_path' from source: unknown 11661 1726882392.41820: calling self._execute() 11661 1726882392.41908: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882392.41912: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882392.41921: variable 'omit' from source: magic vars 11661 1726882392.42202: variable 'ansible_distribution_major_version' from source: facts 11661 1726882392.42213: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882392.42301: variable 'profile_stat' from source: set_fact 11661 1726882392.42312: Evaluated conditional (profile_stat.stat.exists): False 11661 1726882392.42315: when evaluation is False, skipping this task 11661 1726882392.42318: _execute() done 11661 1726882392.42320: dumping result to json 11661 1726882392.42322: done dumping result, returning 11661 1726882392.42329: done running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-bond0.0 [0e448fcc-3ce9-896b-2321-000000000401] 11661 1726882392.42334: sending task result for task 0e448fcc-3ce9-896b-2321-000000000401 11661 1726882392.42423: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000401 11661 1726882392.42426: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11661 1726882392.42474: no more pending results, returning what we have 11661 1726882392.42478: results queue empty 11661 1726882392.42479: checking for any_errors_fatal 11661 1726882392.42485: done checking for any_errors_fatal 11661 1726882392.42486: checking 
for max_fail_percentage 11661 1726882392.42488: done checking for max_fail_percentage 11661 1726882392.42489: checking to see if all hosts have failed and the running result is not ok 11661 1726882392.42489: done checking to see if all hosts have failed 11661 1726882392.42490: getting the remaining hosts for this loop 11661 1726882392.42492: done getting the remaining hosts for this loop 11661 1726882392.42495: getting the next task for host managed_node2 11661 1726882392.42504: done getting next task for host managed_node2 11661 1726882392.42507: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 11661 1726882392.42509: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11661 1726882392.42515: getting variables 11661 1726882392.42516: in VariableManager get_vars() 11661 1726882392.42557: Calling all_inventory to load vars for managed_node2 11661 1726882392.42560: Calling groups_inventory to load vars for managed_node2 11661 1726882392.42562: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882392.42574: Calling all_plugins_play to load vars for managed_node2 11661 1726882392.42576: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882392.42578: Calling groups_plugins_play to load vars for managed_node2 11661 1726882392.43571: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882392.45325: done with get_vars() 11661 1726882392.45353: done getting variables 11661 1726882392.45417: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11661 1726882392.45535: variable 'profile' from source: include params 11661 1726882392.45539: variable 'item' from source: include params 11661 1726882392.45599: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0.0'] ************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 21:33:12 -0400 (0:00:00.046) 0:00:21.169 ****** 11661 1726882392.45630: entering _queue_task() for managed_node2/assert 11661 1726882392.45950: worker is 1 (out of 1 available) 11661 1726882392.45962: exiting _queue_task() for managed_node2/assert 11661 1726882392.45975: done queuing things up, now waiting for results queue to drain 11661 1726882392.45977: waiting for pending results... 
11661 1726882392.46249: running TaskExecutor() for managed_node2/TASK: Assert that the profile is present - 'bond0.0' 11661 1726882392.46487: in run() - task 0e448fcc-3ce9-896b-2321-000000000267 11661 1726882392.46505: variable 'ansible_search_path' from source: unknown 11661 1726882392.46512: variable 'ansible_search_path' from source: unknown 11661 1726882392.46554: calling self._execute() 11661 1726882392.46659: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882392.46673: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882392.46687: variable 'omit' from source: magic vars 11661 1726882392.47041: variable 'ansible_distribution_major_version' from source: facts 11661 1726882392.47058: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882392.47072: variable 'omit' from source: magic vars 11661 1726882392.47116: variable 'omit' from source: magic vars 11661 1726882392.47217: variable 'profile' from source: include params 11661 1726882392.47226: variable 'item' from source: include params 11661 1726882392.47293: variable 'item' from source: include params 11661 1726882392.47317: variable 'omit' from source: magic vars 11661 1726882392.47362: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11661 1726882392.47405: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11661 1726882392.47430: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11661 1726882392.47450: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882392.47469: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882392.47500: variable 'inventory_hostname' from source: host vars for 
'managed_node2' 11661 1726882392.47508: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882392.47519: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882392.47616: Set connection var ansible_connection to ssh 11661 1726882392.48391: Set connection var ansible_pipelining to False 11661 1726882392.48403: Set connection var ansible_module_compression to ZIP_DEFLATED 11661 1726882392.48414: Set connection var ansible_timeout to 10 11661 1726882392.48420: Set connection var ansible_shell_type to sh 11661 1726882392.48431: Set connection var ansible_shell_executable to /bin/sh 11661 1726882392.48519: variable 'ansible_shell_executable' from source: unknown 11661 1726882392.48527: variable 'ansible_connection' from source: unknown 11661 1726882392.48534: variable 'ansible_module_compression' from source: unknown 11661 1726882392.48540: variable 'ansible_shell_type' from source: unknown 11661 1726882392.48546: variable 'ansible_shell_executable' from source: unknown 11661 1726882392.48551: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882392.48558: variable 'ansible_pipelining' from source: unknown 11661 1726882392.48567: variable 'ansible_timeout' from source: unknown 11661 1726882392.48575: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882392.48748: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11661 1726882392.48767: variable 'omit' from source: magic vars 11661 1726882392.48776: starting attempt loop 11661 1726882392.48783: running the handler 11661 1726882392.48898: variable 'lsr_net_profile_exists' from source: set_fact 11661 1726882392.48908: Evaluated conditional 
(lsr_net_profile_exists): True 11661 1726882392.48917: handler run complete 11661 1726882392.48940: attempt loop complete, returning result 11661 1726882392.48948: _execute() done 11661 1726882392.48954: dumping result to json 11661 1726882392.48960: done dumping result, returning 11661 1726882392.48973: done running TaskExecutor() for managed_node2/TASK: Assert that the profile is present - 'bond0.0' [0e448fcc-3ce9-896b-2321-000000000267] 11661 1726882392.48982: sending task result for task 0e448fcc-3ce9-896b-2321-000000000267 ok: [managed_node2] => { "changed": false } MSG: All assertions passed 11661 1726882392.49124: no more pending results, returning what we have 11661 1726882392.49128: results queue empty 11661 1726882392.49128: checking for any_errors_fatal 11661 1726882392.49136: done checking for any_errors_fatal 11661 1726882392.49136: checking for max_fail_percentage 11661 1726882392.49139: done checking for max_fail_percentage 11661 1726882392.49139: checking to see if all hosts have failed and the running result is not ok 11661 1726882392.49141: done checking to see if all hosts have failed 11661 1726882392.49142: getting the remaining hosts for this loop 11661 1726882392.49144: done getting the remaining hosts for this loop 11661 1726882392.49147: getting the next task for host managed_node2 11661 1726882392.49155: done getting next task for host managed_node2 11661 1726882392.49158: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 11661 1726882392.49161: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11661 1726882392.49166: getting variables 11661 1726882392.49168: in VariableManager get_vars() 11661 1726882392.49212: Calling all_inventory to load vars for managed_node2 11661 1726882392.49215: Calling groups_inventory to load vars for managed_node2 11661 1726882392.49218: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882392.49229: Calling all_plugins_play to load vars for managed_node2 11661 1726882392.49233: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882392.49236: Calling groups_plugins_play to load vars for managed_node2 11661 1726882392.50796: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000267 11661 1726882392.50799: WORKER PROCESS EXITING 11661 1726882392.51150: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882392.52834: done with get_vars() 11661 1726882392.52861: done getting variables 11661 1726882392.52921: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11661 1726882392.53035: variable 'profile' from source: include params 11661 1726882392.53038: variable 'item' from source: include params 11661 1726882392.53098: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0.0'] ********* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 21:33:12 -0400 (0:00:00.074) 0:00:21.244 ****** 11661 1726882392.53132: entering _queue_task() for managed_node2/assert 11661 
1726882392.53707: worker is 1 (out of 1 available) 11661 1726882392.53719: exiting _queue_task() for managed_node2/assert 11661 1726882392.53731: done queuing things up, now waiting for results queue to drain 11661 1726882392.53732: waiting for pending results... 11661 1726882392.54728: running TaskExecutor() for managed_node2/TASK: Assert that the ansible managed comment is present in 'bond0.0' 11661 1726882392.54886: in run() - task 0e448fcc-3ce9-896b-2321-000000000268 11661 1726882392.55934: variable 'ansible_search_path' from source: unknown 11661 1726882392.56139: variable 'ansible_search_path' from source: unknown 11661 1726882392.56185: calling self._execute() 11661 1726882392.56410: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882392.56486: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882392.56490: variable 'omit' from source: magic vars 11661 1726882392.56877: variable 'ansible_distribution_major_version' from source: facts 11661 1726882392.56887: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882392.56894: variable 'omit' from source: magic vars 11661 1726882392.56938: variable 'omit' from source: magic vars 11661 1726882392.57038: variable 'profile' from source: include params 11661 1726882392.57047: variable 'item' from source: include params 11661 1726882392.57109: variable 'item' from source: include params 11661 1726882392.57128: variable 'omit' from source: magic vars 11661 1726882392.57174: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11661 1726882392.57211: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11661 1726882392.57230: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11661 1726882392.57246: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 11661 1726882392.57258: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882392.57292: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11661 1726882392.57296: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882392.57299: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882392.57390: Set connection var ansible_connection to ssh 11661 1726882392.57395: Set connection var ansible_pipelining to False 11661 1726882392.57401: Set connection var ansible_module_compression to ZIP_DEFLATED 11661 1726882392.57408: Set connection var ansible_timeout to 10 11661 1726882392.57411: Set connection var ansible_shell_type to sh 11661 1726882392.57418: Set connection var ansible_shell_executable to /bin/sh 11661 1726882392.57442: variable 'ansible_shell_executable' from source: unknown 11661 1726882392.57446: variable 'ansible_connection' from source: unknown 11661 1726882392.57448: variable 'ansible_module_compression' from source: unknown 11661 1726882392.57453: variable 'ansible_shell_type' from source: unknown 11661 1726882392.57456: variable 'ansible_shell_executable' from source: unknown 11661 1726882392.57458: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882392.57461: variable 'ansible_pipelining' from source: unknown 11661 1726882392.57466: variable 'ansible_timeout' from source: unknown 11661 1726882392.57469: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882392.57618: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11661 
1726882392.57629: variable 'omit' from source: magic vars 11661 1726882392.57634: starting attempt loop 11661 1726882392.57637: running the handler 11661 1726882392.57753: variable 'lsr_net_profile_ansible_managed' from source: set_fact 11661 1726882392.57757: Evaluated conditional (lsr_net_profile_ansible_managed): True 11661 1726882392.57761: handler run complete 11661 1726882392.57780: attempt loop complete, returning result 11661 1726882392.57783: _execute() done 11661 1726882392.57786: dumping result to json 11661 1726882392.57788: done dumping result, returning 11661 1726882392.57795: done running TaskExecutor() for managed_node2/TASK: Assert that the ansible managed comment is present in 'bond0.0' [0e448fcc-3ce9-896b-2321-000000000268] 11661 1726882392.57802: sending task result for task 0e448fcc-3ce9-896b-2321-000000000268 11661 1726882392.57898: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000268 11661 1726882392.57901: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 11661 1726882392.57969: no more pending results, returning what we have 11661 1726882392.57973: results queue empty 11661 1726882392.57974: checking for any_errors_fatal 11661 1726882392.57984: done checking for any_errors_fatal 11661 1726882392.57985: checking for max_fail_percentage 11661 1726882392.57987: done checking for max_fail_percentage 11661 1726882392.57988: checking to see if all hosts have failed and the running result is not ok 11661 1726882392.57989: done checking to see if all hosts have failed 11661 1726882392.57990: getting the remaining hosts for this loop 11661 1726882392.57994: done getting the remaining hosts for this loop 11661 1726882392.57998: getting the next task for host managed_node2 11661 1726882392.58005: done getting next task for host managed_node2 11661 1726882392.58008: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 11661 1726882392.58011: ^ state is: HOST 
STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11661 1726882392.58015: getting variables 11661 1726882392.58017: in VariableManager get_vars() 11661 1726882392.58061: Calling all_inventory to load vars for managed_node2 11661 1726882392.58065: Calling groups_inventory to load vars for managed_node2 11661 1726882392.58068: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882392.58079: Calling all_plugins_play to load vars for managed_node2 11661 1726882392.58082: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882392.58086: Calling groups_plugins_play to load vars for managed_node2 11661 1726882392.60235: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882392.61889: done with get_vars() 11661 1726882392.61917: done getting variables 11661 1726882392.61965: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11661 1726882392.62048: variable 'profile' from source: include params 11661 1726882392.62051: variable 'item' from source: include params 11661 1726882392.62097: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0.0] *************** task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 21:33:12 -0400 (0:00:00.089) 0:00:21.334 ****** 11661 1726882392.62125: entering _queue_task() for managed_node2/assert 11661 1726882392.62357: worker is 1 (out of 1 available) 11661 1726882392.62374: exiting _queue_task() for managed_node2/assert 11661 1726882392.62385: done queuing things up, now waiting for results queue to drain 11661 1726882392.62387: waiting for pending results... 11661 1726882392.62620: running TaskExecutor() for managed_node2/TASK: Assert that the fingerprint comment is present in bond0.0 11661 1726882392.62975: in run() - task 0e448fcc-3ce9-896b-2321-000000000269 11661 1726882392.62978: variable 'ansible_search_path' from source: unknown 11661 1726882392.62982: variable 'ansible_search_path' from source: unknown 11661 1726882392.62985: calling self._execute() 11661 1726882392.62988: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882392.62995: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882392.62998: variable 'omit' from source: magic vars 11661 1726882392.63310: variable 'ansible_distribution_major_version' from source: facts 11661 1726882392.63325: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882392.63342: variable 'omit' from source: magic vars 11661 1726882392.63393: variable 'omit' from source: magic vars 11661 1726882392.63748: variable 'profile' from source: include params 11661 1726882392.63754: variable 'item' from source: include params 11661 1726882392.63822: variable 'item' from source: include params 11661 1726882392.63841: variable 'omit' from source: magic vars 11661 1726882392.63893: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11661 1726882392.63928: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11661 1726882392.63948: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11661 1726882392.63968: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882392.63978: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882392.64012: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11661 1726882392.64015: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882392.64018: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882392.64119: Set connection var ansible_connection to ssh 11661 1726882392.64124: Set connection var ansible_pipelining to False 11661 1726882392.64130: Set connection var ansible_module_compression to ZIP_DEFLATED 11661 1726882392.64138: Set connection var ansible_timeout to 10 11661 1726882392.64141: Set connection var ansible_shell_type to sh 11661 1726882392.64148: Set connection var ansible_shell_executable to /bin/sh 11661 1726882392.64173: variable 'ansible_shell_executable' from source: unknown 11661 1726882392.64176: variable 'ansible_connection' from source: unknown 11661 1726882392.64179: variable 'ansible_module_compression' from source: unknown 11661 1726882392.64181: variable 'ansible_shell_type' from source: unknown 11661 1726882392.64184: variable 'ansible_shell_executable' from source: unknown 11661 1726882392.64186: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882392.64190: variable 'ansible_pipelining' from source: unknown 11661 1726882392.64193: variable 'ansible_timeout' from source: unknown 11661 1726882392.64195: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 
1726882392.64668: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11661 1726882392.64683: variable 'omit' from source: magic vars 11661 1726882392.64688: starting attempt loop 11661 1726882392.64691: running the handler 11661 1726882392.64822: variable 'lsr_net_profile_fingerprint' from source: set_fact 11661 1726882392.64825: Evaluated conditional (lsr_net_profile_fingerprint): True 11661 1726882392.64834: handler run complete 11661 1726882392.64848: attempt loop complete, returning result 11661 1726882392.64854: _execute() done 11661 1726882392.64858: dumping result to json 11661 1726882392.64874: done dumping result, returning 11661 1726882392.64896: done running TaskExecutor() for managed_node2/TASK: Assert that the fingerprint comment is present in bond0.0 [0e448fcc-3ce9-896b-2321-000000000269] 11661 1726882392.64907: sending task result for task 0e448fcc-3ce9-896b-2321-000000000269 ok: [managed_node2] => { "changed": false } MSG: All assertions passed 11661 1726882392.65052: no more pending results, returning what we have 11661 1726882392.65056: results queue empty 11661 1726882392.65057: checking for any_errors_fatal 11661 1726882392.65067: done checking for any_errors_fatal 11661 1726882392.65068: checking for max_fail_percentage 11661 1726882392.65070: done checking for max_fail_percentage 11661 1726882392.65071: checking to see if all hosts have failed and the running result is not ok 11661 1726882392.65072: done checking to see if all hosts have failed 11661 1726882392.65072: getting the remaining hosts for this loop 11661 1726882392.65074: done getting the remaining hosts for this loop 11661 1726882392.65078: getting the next task for host managed_node2 11661 1726882392.65089: done getting next task 
for host managed_node2 11661 1726882392.65091: ^ task is: TASK: Include the task 'get_profile_stat.yml' 11661 1726882392.65094: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11661 1726882392.65098: getting variables 11661 1726882392.65100: in VariableManager get_vars() 11661 1726882392.65139: Calling all_inventory to load vars for managed_node2 11661 1726882392.65142: Calling groups_inventory to load vars for managed_node2 11661 1726882392.65144: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882392.65155: Calling all_plugins_play to load vars for managed_node2 11661 1726882392.65158: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882392.65162: Calling groups_plugins_play to load vars for managed_node2 11661 1726882392.65688: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000269 11661 1726882392.65691: WORKER PROCESS EXITING 11661 1726882392.66309: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882392.68598: done with get_vars() 11661 1726882392.68629: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 21:33:12 -0400 (0:00:00.069) 0:00:21.403 ****** 11661 1726882392.69046: entering _queue_task() 
for managed_node2/include_tasks 11661 1726882392.69747: worker is 1 (out of 1 available) 11661 1726882392.69760: exiting _queue_task() for managed_node2/include_tasks 11661 1726882392.69774: done queuing things up, now waiting for results queue to drain 11661 1726882392.69776: waiting for pending results... 11661 1726882392.70065: running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' 11661 1726882392.70171: in run() - task 0e448fcc-3ce9-896b-2321-00000000026d 11661 1726882392.70186: variable 'ansible_search_path' from source: unknown 11661 1726882392.70191: variable 'ansible_search_path' from source: unknown 11661 1726882392.70231: calling self._execute() 11661 1726882392.70328: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882392.70334: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882392.70346: variable 'omit' from source: magic vars 11661 1726882392.70718: variable 'ansible_distribution_major_version' from source: facts 11661 1726882392.70730: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882392.70737: _execute() done 11661 1726882392.70741: dumping result to json 11661 1726882392.70743: done dumping result, returning 11661 1726882392.70749: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' [0e448fcc-3ce9-896b-2321-00000000026d] 11661 1726882392.70754: sending task result for task 0e448fcc-3ce9-896b-2321-00000000026d 11661 1726882392.70858: done sending task result for task 0e448fcc-3ce9-896b-2321-00000000026d 11661 1726882392.70861: WORKER PROCESS EXITING 11661 1726882392.70893: no more pending results, returning what we have 11661 1726882392.70899: in VariableManager get_vars() 11661 1726882392.70948: Calling all_inventory to load vars for managed_node2 11661 1726882392.70951: Calling groups_inventory to load vars for managed_node2 11661 1726882392.70954: Calling 
all_plugins_inventory to load vars for managed_node2 11661 1726882392.70970: Calling all_plugins_play to load vars for managed_node2 11661 1726882392.70974: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882392.70977: Calling groups_plugins_play to load vars for managed_node2 11661 1726882392.72728: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882392.74484: done with get_vars() 11661 1726882392.74508: variable 'ansible_search_path' from source: unknown 11661 1726882392.74510: variable 'ansible_search_path' from source: unknown 11661 1726882392.74546: we have included files to process 11661 1726882392.74547: generating all_blocks data 11661 1726882392.74549: done generating all_blocks data 11661 1726882392.74554: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11661 1726882392.74555: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11661 1726882392.74558: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11661 1726882392.76207: done processing included file 11661 1726882392.76209: iterating over new_blocks loaded from include file 11661 1726882392.76211: in VariableManager get_vars() 11661 1726882392.76268: done with get_vars() 11661 1726882392.76276: filtering new block on tags 11661 1726882392.76303: done filtering new block on tags 11661 1726882392.76305: in VariableManager get_vars() 11661 1726882392.76326: done with get_vars() 11661 1726882392.76328: filtering new block on tags 11661 1726882392.76349: done filtering new block on tags 11661 1726882392.76352: done iterating over new_blocks loaded from include file included: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node2 11661 1726882392.76357: extending task lists for all hosts with included blocks 11661 1726882392.76548: done extending task lists 11661 1726882392.76549: done processing included files 11661 1726882392.76550: results queue empty 11661 1726882392.76551: checking for any_errors_fatal 11661 1726882392.76555: done checking for any_errors_fatal 11661 1726882392.76556: checking for max_fail_percentage 11661 1726882392.76557: done checking for max_fail_percentage 11661 1726882392.76558: checking to see if all hosts have failed and the running result is not ok 11661 1726882392.76559: done checking to see if all hosts have failed 11661 1726882392.76560: getting the remaining hosts for this loop 11661 1726882392.76561: done getting the remaining hosts for this loop 11661 1726882392.76565: getting the next task for host managed_node2 11661 1726882392.76569: done getting next task for host managed_node2 11661 1726882392.76571: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 11661 1726882392.76574: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 11661 1726882392.76577: getting variables 11661 1726882392.76578: in VariableManager get_vars() 11661 1726882392.76590: Calling all_inventory to load vars for managed_node2 11661 1726882392.76593: Calling groups_inventory to load vars for managed_node2 11661 1726882392.76595: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882392.76600: Calling all_plugins_play to load vars for managed_node2 11661 1726882392.76603: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882392.76605: Calling groups_plugins_play to load vars for managed_node2 11661 1726882392.77902: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882392.79551: done with get_vars() 11661 1726882392.79582: done getting variables 11661 1726882392.79629: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:33:12 -0400 (0:00:00.106) 0:00:21.510 ****** 11661 1726882392.79669: entering _queue_task() for managed_node2/set_fact 11661 1726882392.80004: worker is 1 (out of 1 available) 11661 1726882392.80016: exiting _queue_task() for managed_node2/set_fact 11661 1726882392.80027: done queuing things up, now waiting for results queue to drain 11661 1726882392.80029: waiting for pending results... 
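The trace above finishes the three assertions for 'bond0.0' and then re-enters `assert_profile_present.yml` to include `get_profile_stat.yml` for the next profile. From the task names, the file paths (`assert_profile_present.yml:3`, `:10`, `:15`), and the conditionals the log evaluates (`lsr_net_profile_exists`, `lsr_net_profile_ansible_managed`, `lsr_net_profile_fingerprint`), the include file plausibly looks roughly like this. This is a reconstruction for orientation, not the actual file contents:

```yaml
# Hypothetical reconstruction of tasks/assert_profile_present.yml,
# inferred from task names and evaluated conditionals in the trace.
- name: Include the task 'get_profile_stat.yml'
  include_tasks: get_profile_stat.yml

- name: Assert that the profile is present - '{{ profile }}'
  assert:
    that:
      - lsr_net_profile_exists

- name: Assert that the ansible managed comment is present in '{{ profile }}'
  assert:
    that:
      - lsr_net_profile_ansible_managed

- name: Assert that the fingerprint comment is present in {{ profile }}
  assert:
    that:
      - lsr_net_profile_fingerprint
```

Each `Evaluated conditional (...): True` line in the trace corresponds to one of these `that` expressions passing, which is why every assert returns `ok` with "All assertions passed".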
11661 1726882392.80309: running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag 11661 1726882392.80415: in run() - task 0e448fcc-3ce9-896b-2321-000000000440 11661 1726882392.80428: variable 'ansible_search_path' from source: unknown 11661 1726882392.80431: variable 'ansible_search_path' from source: unknown 11661 1726882392.80466: calling self._execute() 11661 1726882392.80558: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882392.80562: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882392.80578: variable 'omit' from source: magic vars 11661 1726882392.80945: variable 'ansible_distribution_major_version' from source: facts 11661 1726882392.80958: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882392.80967: variable 'omit' from source: magic vars 11661 1726882392.81009: variable 'omit' from source: magic vars 11661 1726882392.81048: variable 'omit' from source: magic vars 11661 1726882392.81090: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11661 1726882392.81129: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11661 1726882392.81150: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11661 1726882392.81169: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882392.81181: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882392.81211: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11661 1726882392.81214: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882392.81218: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 11661 1726882392.81317: Set connection var ansible_connection to ssh 11661 1726882392.81321: Set connection var ansible_pipelining to False 11661 1726882392.81329: Set connection var ansible_module_compression to ZIP_DEFLATED 11661 1726882392.81335: Set connection var ansible_timeout to 10 11661 1726882392.81337: Set connection var ansible_shell_type to sh 11661 1726882392.81355: Set connection var ansible_shell_executable to /bin/sh 11661 1726882392.81376: variable 'ansible_shell_executable' from source: unknown 11661 1726882392.81379: variable 'ansible_connection' from source: unknown 11661 1726882392.81382: variable 'ansible_module_compression' from source: unknown 11661 1726882392.81384: variable 'ansible_shell_type' from source: unknown 11661 1726882392.81387: variable 'ansible_shell_executable' from source: unknown 11661 1726882392.81389: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882392.81391: variable 'ansible_pipelining' from source: unknown 11661 1726882392.81395: variable 'ansible_timeout' from source: unknown 11661 1726882392.81399: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882392.81536: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11661 1726882392.81546: variable 'omit' from source: magic vars 11661 1726882392.81552: starting attempt loop 11661 1726882392.81555: running the handler 11661 1726882392.81573: handler run complete 11661 1726882392.81583: attempt loop complete, returning result 11661 1726882392.81586: _execute() done 11661 1726882392.81588: dumping result to json 11661 1726882392.81590: done dumping result, returning 11661 1726882392.81599: done running TaskExecutor() for 
managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag [0e448fcc-3ce9-896b-2321-000000000440] 11661 1726882392.81604: sending task result for task 0e448fcc-3ce9-896b-2321-000000000440 ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 11661 1726882392.81745: no more pending results, returning what we have 11661 1726882392.81749: results queue empty 11661 1726882392.81750: checking for any_errors_fatal 11661 1726882392.81752: done checking for any_errors_fatal 11661 1726882392.81753: checking for max_fail_percentage 11661 1726882392.81755: done checking for max_fail_percentage 11661 1726882392.81756: checking to see if all hosts have failed and the running result is not ok 11661 1726882392.81757: done checking to see if all hosts have failed 11661 1726882392.81757: getting the remaining hosts for this loop 11661 1726882392.81759: done getting the remaining hosts for this loop 11661 1726882392.81765: getting the next task for host managed_node2 11661 1726882392.81772: done getting next task for host managed_node2 11661 1726882392.81775: ^ task is: TASK: Stat profile file 11661 1726882392.81779: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11661 1726882392.81783: getting variables 11661 1726882392.81785: in VariableManager get_vars() 11661 1726882392.81828: Calling all_inventory to load vars for managed_node2 11661 1726882392.81831: Calling groups_inventory to load vars for managed_node2 11661 1726882392.81833: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882392.81845: Calling all_plugins_play to load vars for managed_node2 11661 1726882392.81849: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882392.81853: Calling groups_plugins_play to load vars for managed_node2 11661 1726882392.82368: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000440 11661 1726882392.82372: WORKER PROCESS EXITING 11661 1726882392.83630: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882392.85290: done with get_vars() 11661 1726882392.85318: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:33:12 -0400 (0:00:00.057) 0:00:21.567 ****** 11661 1726882392.85413: entering _queue_task() for managed_node2/stat 11661 1726882392.85734: worker is 1 (out of 1 available) 11661 1726882392.85744: exiting _queue_task() for managed_node2/stat 11661 1726882392.85755: done queuing things up, now waiting for results queue to drain 11661 1726882392.85757: waiting for pending results... 
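The `set_fact` result above shows `get_profile_stat.yml` first resetting its three flags to `false` before stat-ing the profile file (`get_profile_stat.yml:3` and `:9`). A plausible sketch of those first two tasks follows; the flag names and initial values are taken directly from the result JSON in the log, while the `stat` path and the register name are assumptions for illustration:

```yaml
# Sketch of the first tasks in tasks/get_profile_stat.yml. Flag names and
# defaults come from the set_fact result in the trace; the stat path and
# register name below are assumed, not confirmed by the log.
- name: Initialize NM profile exist and ansible_managed comment flag
  set_fact:
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false

- name: Stat profile file
  stat:
    path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"  # assumed location
  register: profile_stat  # hypothetical register name
```

Resetting the flags first ensures a stale `true` from a previous profile's iteration cannot leak into the next round of assertions; the subsequent `stat` task is what triggers the `_low_level_execute_command()` / SSH activity seen below.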
11661 1726882392.86033: running TaskExecutor() for managed_node2/TASK: Stat profile file 11661 1726882392.86126: in run() - task 0e448fcc-3ce9-896b-2321-000000000441 11661 1726882392.86139: variable 'ansible_search_path' from source: unknown 11661 1726882392.86143: variable 'ansible_search_path' from source: unknown 11661 1726882392.86177: calling self._execute() 11661 1726882392.86276: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882392.86280: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882392.86290: variable 'omit' from source: magic vars 11661 1726882392.86662: variable 'ansible_distribution_major_version' from source: facts 11661 1726882392.86675: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882392.86681: variable 'omit' from source: magic vars 11661 1726882392.86727: variable 'omit' from source: magic vars 11661 1726882392.86826: variable 'profile' from source: include params 11661 1726882392.86830: variable 'item' from source: include params 11661 1726882392.86896: variable 'item' from source: include params 11661 1726882392.86914: variable 'omit' from source: magic vars 11661 1726882392.86958: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11661 1726882392.86996: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11661 1726882392.87015: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11661 1726882392.87032: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882392.87043: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882392.87080: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11661 
1726882392.87083: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882392.87086: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882392.87186: Set connection var ansible_connection to ssh 11661 1726882392.87191: Set connection var ansible_pipelining to False 11661 1726882392.87197: Set connection var ansible_module_compression to ZIP_DEFLATED 11661 1726882392.87204: Set connection var ansible_timeout to 10 11661 1726882392.87207: Set connection var ansible_shell_type to sh 11661 1726882392.87214: Set connection var ansible_shell_executable to /bin/sh 11661 1726882392.87236: variable 'ansible_shell_executable' from source: unknown 11661 1726882392.87239: variable 'ansible_connection' from source: unknown 11661 1726882392.87242: variable 'ansible_module_compression' from source: unknown 11661 1726882392.87246: variable 'ansible_shell_type' from source: unknown 11661 1726882392.87248: variable 'ansible_shell_executable' from source: unknown 11661 1726882392.87252: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882392.87255: variable 'ansible_pipelining' from source: unknown 11661 1726882392.87258: variable 'ansible_timeout' from source: unknown 11661 1726882392.87260: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882392.87457: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11661 1726882392.87468: variable 'omit' from source: magic vars 11661 1726882392.87475: starting attempt loop 11661 1726882392.87479: running the handler 11661 1726882392.87491: _low_level_execute_command(): starting 11661 1726882392.87504: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11661 1726882392.88257: stderr chunk 
(state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882392.88276: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882392.88287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882392.88302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882392.88342: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882392.88353: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882392.88362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882392.88378: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882392.88387: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882392.88395: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882392.88403: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882392.88412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882392.88424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882392.88431: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882392.88438: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882392.88447: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882392.88523: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882392.88543: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882392.88556: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 11661 1726882392.88692: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882392.90374: stdout chunk (state=3): >>>/root <<< 11661 1726882392.90478: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882392.90576: stderr chunk (state=3): >>><<< 11661 1726882392.90589: stdout chunk (state=3): >>><<< 11661 1726882392.90718: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882392.90721: _low_level_execute_command(): starting 11661 1726882392.90725: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882392.9061968-12634-115797838792870 `" && echo 
ansible-tmp-1726882392.9061968-12634-115797838792870="` echo /root/.ansible/tmp/ansible-tmp-1726882392.9061968-12634-115797838792870 `" ) && sleep 0' 11661 1726882392.91518: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882392.91532: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882392.91545: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882392.91562: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882392.91611: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882392.91622: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882392.91635: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882392.91651: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882392.91666: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882392.91677: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882392.91690: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882392.91702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882392.91720: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882392.91732: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882392.91742: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882392.91755: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882392.91828: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master <<< 11661 1726882392.91854: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882392.91874: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882392.92006: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882392.93960: stdout chunk (state=3): >>>ansible-tmp-1726882392.9061968-12634-115797838792870=/root/.ansible/tmp/ansible-tmp-1726882392.9061968-12634-115797838792870 <<< 11661 1726882392.94170: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882392.94174: stdout chunk (state=3): >>><<< 11661 1726882392.94176: stderr chunk (state=3): >>><<< 11661 1726882392.94381: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882392.9061968-12634-115797838792870=/root/.ansible/tmp/ansible-tmp-1726882392.9061968-12634-115797838792870 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882392.94384: variable 'ansible_module_compression' from source: unknown 11661 1726882392.94386: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11661k_nrvl51/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11661 1726882392.94389: variable 'ansible_facts' from source: unknown 11661 1726882392.94446: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882392.9061968-12634-115797838792870/AnsiballZ_stat.py 11661 1726882392.94615: Sending initial data 11661 1726882392.94618: Sent initial data (153 bytes) 11661 1726882392.96220: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882392.96238: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882392.96254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882392.96283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882392.96326: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882392.96339: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882392.96352: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882392.96372: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882392.96383: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882392.96397: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882392.96409: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882392.96421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882392.96435: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882392.96447: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882392.96459: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882392.96477: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882392.96557: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882392.96583: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882392.96598: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882392.96838: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882392.98548: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11661 1726882392.98641: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 11661 1726882392.98742: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11661k_nrvl51/tmpthvmpm49 /root/.ansible/tmp/ansible-tmp-1726882392.9061968-12634-115797838792870/AnsiballZ_stat.py <<< 11661 1726882392.98839: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11661 1726882393.00367: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882393.00478: stderr chunk (state=3): >>><<< 11661 1726882393.00481: stdout chunk (state=3): >>><<< 11661 1726882393.00483: done transferring module to remote 11661 1726882393.00489: _low_level_execute_command(): starting 11661 1726882393.00491: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882392.9061968-12634-115797838792870/ /root/.ansible/tmp/ansible-tmp-1726882392.9061968-12634-115797838792870/AnsiballZ_stat.py && sleep 0' 11661 1726882393.01172: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882393.01175: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882393.01224: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 11661 1726882393.01228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882393.01231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882393.01296: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882393.01309: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11661 
1726882393.01433: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882393.03246: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882393.03327: stderr chunk (state=3): >>><<< 11661 1726882393.03331: stdout chunk (state=3): >>><<< 11661 1726882393.03430: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882393.03434: _low_level_execute_command(): starting 11661 1726882393.03436: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882392.9061968-12634-115797838792870/AnsiballZ_stat.py && sleep 0' 11661 1726882393.04646: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882393.04672: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 
11661 1726882393.04676: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882393.04696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882393.04732: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882393.04738: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882393.04747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882393.04767: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882393.04775: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882393.04781: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882393.04789: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882393.04797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882393.04815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882393.04824: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882393.04833: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882393.04843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882393.04916: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882393.04941: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882393.05078: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882393.18262: stdout chunk (state=3): >>> {"changed": 
false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11661 1726882393.19282: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 11661 1726882393.19333: stderr chunk (state=3): >>><<< 11661 1726882393.19337: stdout chunk (state=3): >>><<< 11661 1726882393.19357: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 
closed. 11661 1726882393.19382: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0.1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882392.9061968-12634-115797838792870/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11661 1726882393.19390: _low_level_execute_command(): starting 11661 1726882393.19395: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882392.9061968-12634-115797838792870/ > /dev/null 2>&1 && sleep 0' 11661 1726882393.19823: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882393.19828: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882393.19880: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882393.19883: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882393.19886: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 11661 1726882393.19888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882393.19945: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882393.19952: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882393.19954: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882393.20050: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882393.21873: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882393.21919: stderr chunk (state=3): >>><<< 11661 1726882393.21922: stdout chunk (state=3): >>><<< 11661 1726882393.21937: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882393.21945: handler run complete 11661 1726882393.21961: attempt loop complete, returning result 11661 1726882393.21966: _execute() done 11661 1726882393.21968: dumping result to json 11661 1726882393.21970: done dumping result, returning 11661 1726882393.21978: done running TaskExecutor() for managed_node2/TASK: Stat profile file [0e448fcc-3ce9-896b-2321-000000000441] 11661 1726882393.21983: sending task result for task 0e448fcc-3ce9-896b-2321-000000000441 ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 11661 1726882393.22134: no more pending results, returning what we have 11661 1726882393.22137: results queue empty 11661 1726882393.22138: checking for any_errors_fatal 11661 1726882393.22145: done checking for any_errors_fatal 11661 1726882393.22146: checking for max_fail_percentage 11661 1726882393.22147: done checking for max_fail_percentage 11661 1726882393.22148: checking to see if all hosts have failed and the running result is not ok 11661 1726882393.22149: done checking to see if all hosts have failed 11661 1726882393.22149: getting the remaining hosts for this loop 11661 1726882393.22153: done getting the remaining hosts for this loop 11661 1726882393.22156: getting the next task for host managed_node2 11661 1726882393.22163: done getting next task for host managed_node2 11661 1726882393.22167: ^ task is: TASK: Set NM profile exist flag based on the profile files 11661 1726882393.22171: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11661 1726882393.22175: getting variables 11661 1726882393.22176: in VariableManager get_vars() 11661 1726882393.22218: Calling all_inventory to load vars for managed_node2 11661 1726882393.22221: Calling groups_inventory to load vars for managed_node2 11661 1726882393.22223: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882393.22234: Calling all_plugins_play to load vars for managed_node2 11661 1726882393.22236: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882393.22238: Calling groups_plugins_play to load vars for managed_node2 11661 1726882393.22868: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000441 11661 1726882393.22872: WORKER PROCESS EXITING 11661 1726882393.23081: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882393.24017: done with get_vars() 11661 1726882393.24033: done getting variables 11661 1726882393.24078: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on 
the profile files] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:33:13 -0400 (0:00:00.386) 0:00:21.954 ****** 11661 1726882393.24102: entering _queue_task() for managed_node2/set_fact 11661 1726882393.24310: worker is 1 (out of 1 available) 11661 1726882393.24322: exiting _queue_task() for managed_node2/set_fact 11661 1726882393.24335: done queuing things up, now waiting for results queue to drain 11661 1726882393.24337: waiting for pending results... 11661 1726882393.24514: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files 11661 1726882393.24596: in run() - task 0e448fcc-3ce9-896b-2321-000000000442 11661 1726882393.24607: variable 'ansible_search_path' from source: unknown 11661 1726882393.24610: variable 'ansible_search_path' from source: unknown 11661 1726882393.24637: calling self._execute() 11661 1726882393.24713: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882393.24719: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882393.24728: variable 'omit' from source: magic vars 11661 1726882393.24998: variable 'ansible_distribution_major_version' from source: facts 11661 1726882393.25009: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882393.25100: variable 'profile_stat' from source: set_fact 11661 1726882393.25113: Evaluated conditional (profile_stat.stat.exists): False 11661 1726882393.25116: when evaluation is False, skipping this task 11661 1726882393.25119: _execute() done 11661 1726882393.25121: dumping result to json 11661 1726882393.25124: done dumping result, returning 11661 1726882393.25129: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files [0e448fcc-3ce9-896b-2321-000000000442] 11661 1726882393.25136: sending task 
result for task 0e448fcc-3ce9-896b-2321-000000000442 11661 1726882393.25219: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000442 11661 1726882393.25222: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11661 1726882393.25281: no more pending results, returning what we have 11661 1726882393.25285: results queue empty 11661 1726882393.25286: checking for any_errors_fatal 11661 1726882393.25293: done checking for any_errors_fatal 11661 1726882393.25293: checking for max_fail_percentage 11661 1726882393.25295: done checking for max_fail_percentage 11661 1726882393.25296: checking to see if all hosts have failed and the running result is not ok 11661 1726882393.25297: done checking to see if all hosts have failed 11661 1726882393.25297: getting the remaining hosts for this loop 11661 1726882393.25299: done getting the remaining hosts for this loop 11661 1726882393.25302: getting the next task for host managed_node2 11661 1726882393.25309: done getting next task for host managed_node2 11661 1726882393.25311: ^ task is: TASK: Get NM profile info 11661 1726882393.25315: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 11661 1726882393.25319: getting variables 11661 1726882393.25320: in VariableManager get_vars() 11661 1726882393.25358: Calling all_inventory to load vars for managed_node2 11661 1726882393.25361: Calling groups_inventory to load vars for managed_node2 11661 1726882393.25363: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882393.25375: Calling all_plugins_play to load vars for managed_node2 11661 1726882393.25377: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882393.25380: Calling groups_plugins_play to load vars for managed_node2 11661 1726882393.29869: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882393.30801: done with get_vars() 11661 1726882393.30820: done getting variables 11661 1726882393.30861: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:33:13 -0400 (0:00:00.067) 0:00:22.022 ****** 11661 1726882393.30883: entering _queue_task() for managed_node2/shell 11661 1726882393.31123: worker is 1 (out of 1 available) 11661 1726882393.31136: exiting _queue_task() for managed_node2/shell 11661 1726882393.31148: done queuing things up, now waiting for results queue to drain 11661 1726882393.31153: waiting for pending results... 
11661 1726882393.31328: running TaskExecutor() for managed_node2/TASK: Get NM profile info 11661 1726882393.31412: in run() - task 0e448fcc-3ce9-896b-2321-000000000443 11661 1726882393.31424: variable 'ansible_search_path' from source: unknown 11661 1726882393.31428: variable 'ansible_search_path' from source: unknown 11661 1726882393.31457: calling self._execute() 11661 1726882393.31543: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882393.31547: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882393.31560: variable 'omit' from source: magic vars 11661 1726882393.31848: variable 'ansible_distribution_major_version' from source: facts 11661 1726882393.31858: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882393.31867: variable 'omit' from source: magic vars 11661 1726882393.31901: variable 'omit' from source: magic vars 11661 1726882393.31975: variable 'profile' from source: include params 11661 1726882393.31980: variable 'item' from source: include params 11661 1726882393.32027: variable 'item' from source: include params 11661 1726882393.32042: variable 'omit' from source: magic vars 11661 1726882393.32081: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11661 1726882393.32109: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11661 1726882393.32126: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11661 1726882393.32139: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882393.32155: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882393.32177: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11661 
1726882393.32180: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882393.32183: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882393.32253: Set connection var ansible_connection to ssh 11661 1726882393.32256: Set connection var ansible_pipelining to False 11661 1726882393.32262: Set connection var ansible_module_compression to ZIP_DEFLATED 11661 1726882393.32270: Set connection var ansible_timeout to 10 11661 1726882393.32273: Set connection var ansible_shell_type to sh 11661 1726882393.32280: Set connection var ansible_shell_executable to /bin/sh 11661 1726882393.32296: variable 'ansible_shell_executable' from source: unknown 11661 1726882393.32299: variable 'ansible_connection' from source: unknown 11661 1726882393.32302: variable 'ansible_module_compression' from source: unknown 11661 1726882393.32304: variable 'ansible_shell_type' from source: unknown 11661 1726882393.32306: variable 'ansible_shell_executable' from source: unknown 11661 1726882393.32309: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882393.32312: variable 'ansible_pipelining' from source: unknown 11661 1726882393.32315: variable 'ansible_timeout' from source: unknown 11661 1726882393.32319: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882393.32416: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11661 1726882393.32426: variable 'omit' from source: magic vars 11661 1726882393.32430: starting attempt loop 11661 1726882393.32434: running the handler 11661 1726882393.32443: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11661 1726882393.32458: _low_level_execute_command(): starting 11661 1726882393.32467: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11661 1726882393.33003: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882393.33019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882393.33039: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 11661 1726882393.33053: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882393.33104: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882393.33116: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882393.33228: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882393.34899: stdout chunk (state=3): >>>/root <<< 11661 1726882393.35000: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882393.35060: 
stderr chunk (state=3): >>><<< 11661 1726882393.35066: stdout chunk (state=3): >>><<< 11661 1726882393.35094: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882393.35106: _low_level_execute_command(): starting 11661 1726882393.35112: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882393.350937-12662-275301437150300 `" && echo ansible-tmp-1726882393.350937-12662-275301437150300="` echo /root/.ansible/tmp/ansible-tmp-1726882393.350937-12662-275301437150300 `" ) && sleep 0' 11661 1726882393.35589: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882393.35603: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882393.35629: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882393.35633: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882393.35682: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882393.35690: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882393.35809: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882393.37687: stdout chunk (state=3): >>>ansible-tmp-1726882393.350937-12662-275301437150300=/root/.ansible/tmp/ansible-tmp-1726882393.350937-12662-275301437150300 <<< 11661 1726882393.37796: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882393.38088: stderr chunk (state=3): >>><<< 11661 1726882393.38091: stdout chunk (state=3): >>><<< 11661 1726882393.38094: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882393.350937-12662-275301437150300=/root/.ansible/tmp/ansible-tmp-1726882393.350937-12662-275301437150300 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882393.38097: variable 'ansible_module_compression' from source: unknown 11661 1726882393.38100: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11661k_nrvl51/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11661 1726882393.38102: variable 'ansible_facts' from source: unknown 11661 1726882393.38116: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882393.350937-12662-275301437150300/AnsiballZ_command.py 11661 1726882393.38293: Sending initial data 11661 1726882393.38296: Sent initial data (155 bytes) 11661 1726882393.39305: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882393.39309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882393.39338: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882393.39342: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 11661 1726882393.39345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882393.39401: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882393.39408: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882393.39411: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882393.39508: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882393.41300: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11661 1726882393.41393: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 11661 1726882393.41494: stdout chunk (state=3): >>>sftp> put 
/root/.ansible/tmp/ansible-local-11661k_nrvl51/tmpp5h_l7zu /root/.ansible/tmp/ansible-tmp-1726882393.350937-12662-275301437150300/AnsiballZ_command.py <<< 11661 1726882393.41589: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11661 1726882393.42958: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882393.43133: stderr chunk (state=3): >>><<< 11661 1726882393.43137: stdout chunk (state=3): >>><<< 11661 1726882393.43139: done transferring module to remote 11661 1726882393.43142: _low_level_execute_command(): starting 11661 1726882393.43144: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882393.350937-12662-275301437150300/ /root/.ansible/tmp/ansible-tmp-1726882393.350937-12662-275301437150300/AnsiballZ_command.py && sleep 0' 11661 1726882393.43707: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882393.43720: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882393.43733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882393.43750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882393.43795: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882393.43806: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882393.43819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882393.43836: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882393.43847: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882393.43857: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882393.43875: stderr 
chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882393.43893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882393.43910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882393.43921: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882393.43931: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882393.43943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882393.44021: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882393.44043: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882393.44058: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882393.44191: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882393.46100: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882393.46132: stderr chunk (state=3): >>><<< 11661 1726882393.46135: stdout chunk (state=3): >>><<< 11661 1726882393.46239: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882393.46243: _low_level_execute_command(): starting 11661 1726882393.46247: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882393.350937-12662-275301437150300/AnsiballZ_command.py && sleep 0' 11661 1726882393.46972: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882393.46976: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882393.47026: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 11661 1726882393.47029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882393.47032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882393.47088: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master <<< 11661 1726882393.47091: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882393.47097: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882393.47207: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882393.62644: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "start": "2024-09-20 21:33:13.601167", "end": "2024-09-20 21:33:13.624110", "delta": "0:00:00.022943", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11661 1726882393.63987: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 11661 1726882393.64006: stderr chunk (state=3): >>><<< 11661 1726882393.64010: stdout chunk (state=3): >>><<< 11661 1726882393.64029: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "start": "2024-09-20 21:33:13.601167", "end": "2024-09-20 21:33:13.624110", "delta": "0:00:00.022943", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 
closed. 11661 1726882393.64068: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882393.350937-12662-275301437150300/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11661 1726882393.64074: _low_level_execute_command(): starting 11661 1726882393.64080: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882393.350937-12662-275301437150300/ > /dev/null 2>&1 && sleep 0' 11661 1726882393.64735: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882393.64744: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882393.64756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882393.64770: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882393.64815: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882393.64826: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882393.64836: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882393.64850: stderr chunk (state=3): >>>debug1: configuration requests 
final Match pass <<< 11661 1726882393.64857: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882393.64867: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882393.64877: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882393.64887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882393.64899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882393.64909: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882393.64918: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882393.64929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882393.64997: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882393.65015: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882393.65031: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882393.65162: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882393.66970: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882393.67012: stderr chunk (state=3): >>><<< 11661 1726882393.67016: stdout chunk (state=3): >>><<< 11661 1726882393.67028: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882393.67034: handler run complete 11661 1726882393.67054: Evaluated conditional (False): False 11661 1726882393.67061: attempt loop complete, returning result 11661 1726882393.67065: _execute() done 11661 1726882393.67067: dumping result to json 11661 1726882393.67075: done dumping result, returning 11661 1726882393.67086: done running TaskExecutor() for managed_node2/TASK: Get NM profile info [0e448fcc-3ce9-896b-2321-000000000443] 11661 1726882393.67091: sending task result for task 0e448fcc-3ce9-896b-2321-000000000443 11661 1726882393.67181: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000443 11661 1726882393.67183: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "delta": "0:00:00.022943", "end": "2024-09-20 21:33:13.624110", "rc": 0, "start": "2024-09-20 21:33:13.601167" } STDOUT: bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection 11661 1726882393.67250: no more pending results, returning what we have 11661 1726882393.67256: results queue empty 11661 1726882393.67256: checking for any_errors_fatal 11661 1726882393.67262: done checking for any_errors_fatal 11661 
1726882393.67265: checking for max_fail_percentage 11661 1726882393.67266: done checking for max_fail_percentage 11661 1726882393.67267: checking to see if all hosts have failed and the running result is not ok 11661 1726882393.67268: done checking to see if all hosts have failed 11661 1726882393.67268: getting the remaining hosts for this loop 11661 1726882393.67270: done getting the remaining hosts for this loop 11661 1726882393.67273: getting the next task for host managed_node2 11661 1726882393.67280: done getting next task for host managed_node2 11661 1726882393.67282: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 11661 1726882393.67286: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11661 1726882393.67290: getting variables 11661 1726882393.67291: in VariableManager get_vars() 11661 1726882393.67331: Calling all_inventory to load vars for managed_node2 11661 1726882393.67334: Calling groups_inventory to load vars for managed_node2 11661 1726882393.67335: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882393.67345: Calling all_plugins_play to load vars for managed_node2 11661 1726882393.67347: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882393.67350: Calling groups_plugins_play to load vars for managed_node2 11661 1726882393.68187: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882393.70007: done with get_vars() 11661 1726882393.70028: done getting variables 11661 1726882393.70090: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:33:13 -0400 (0:00:00.392) 0:00:22.414 ****** 11661 1726882393.70120: entering _queue_task() for managed_node2/set_fact 11661 1726882393.70398: worker is 1 (out of 1 available) 11661 1726882393.70408: exiting _queue_task() for managed_node2/set_fact 11661 1726882393.70421: done queuing things up, now waiting for results queue to drain 11661 1726882393.70423: waiting for pending results... 
11661 1726882393.70697: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 11661 1726882393.70822: in run() - task 0e448fcc-3ce9-896b-2321-000000000444 11661 1726882393.70842: variable 'ansible_search_path' from source: unknown 11661 1726882393.70850: variable 'ansible_search_path' from source: unknown 11661 1726882393.70895: calling self._execute() 11661 1726882393.70999: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882393.71015: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882393.71028: variable 'omit' from source: magic vars 11661 1726882393.71414: variable 'ansible_distribution_major_version' from source: facts 11661 1726882393.71430: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882393.71573: variable 'nm_profile_exists' from source: set_fact 11661 1726882393.71590: Evaluated conditional (nm_profile_exists.rc == 0): True 11661 1726882393.71600: variable 'omit' from source: magic vars 11661 1726882393.71656: variable 'omit' from source: magic vars 11661 1726882393.71693: variable 'omit' from source: magic vars 11661 1726882393.71741: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11661 1726882393.71784: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11661 1726882393.71807: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11661 1726882393.71829: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882393.71847: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882393.71883: variable 'inventory_hostname' from source: host vars for 'managed_node2' 
11661 1726882393.71891: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882393.71898: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882393.72003: Set connection var ansible_connection to ssh 11661 1726882393.72013: Set connection var ansible_pipelining to False 11661 1726882393.72022: Set connection var ansible_module_compression to ZIP_DEFLATED 11661 1726882393.72033: Set connection var ansible_timeout to 10 11661 1726882393.72040: Set connection var ansible_shell_type to sh 11661 1726882393.72058: Set connection var ansible_shell_executable to /bin/sh 11661 1726882393.72084: variable 'ansible_shell_executable' from source: unknown 11661 1726882393.72091: variable 'ansible_connection' from source: unknown 11661 1726882393.72098: variable 'ansible_module_compression' from source: unknown 11661 1726882393.72104: variable 'ansible_shell_type' from source: unknown 11661 1726882393.72110: variable 'ansible_shell_executable' from source: unknown 11661 1726882393.72115: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882393.72123: variable 'ansible_pipelining' from source: unknown 11661 1726882393.72128: variable 'ansible_timeout' from source: unknown 11661 1726882393.72136: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882393.72286: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11661 1726882393.72300: variable 'omit' from source: magic vars 11661 1726882393.72308: starting attempt loop 11661 1726882393.72314: running the handler 11661 1726882393.72329: handler run complete 11661 1726882393.72345: attempt loop complete, returning result 11661 1726882393.72353: _execute() done 
11661 1726882393.72360: dumping result to json 11661 1726882393.72369: done dumping result, returning 11661 1726882393.72385: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0e448fcc-3ce9-896b-2321-000000000444] 11661 1726882393.72395: sending task result for task 0e448fcc-3ce9-896b-2321-000000000444 ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 11661 1726882393.72534: no more pending results, returning what we have 11661 1726882393.72538: results queue empty 11661 1726882393.72539: checking for any_errors_fatal 11661 1726882393.72547: done checking for any_errors_fatal 11661 1726882393.72547: checking for max_fail_percentage 11661 1726882393.72549: done checking for max_fail_percentage 11661 1726882393.72550: checking to see if all hosts have failed and the running result is not ok 11661 1726882393.72553: done checking to see if all hosts have failed 11661 1726882393.72554: getting the remaining hosts for this loop 11661 1726882393.72557: done getting the remaining hosts for this loop 11661 1726882393.72560: getting the next task for host managed_node2 11661 1726882393.72572: done getting next task for host managed_node2 11661 1726882393.72575: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 11661 1726882393.72579: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11661 1726882393.72583: getting variables 11661 1726882393.72584: in VariableManager get_vars() 11661 1726882393.72623: Calling all_inventory to load vars for managed_node2 11661 1726882393.72625: Calling groups_inventory to load vars for managed_node2 11661 1726882393.72628: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882393.72638: Calling all_plugins_play to load vars for managed_node2 11661 1726882393.72642: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882393.72645: Calling groups_plugins_play to load vars for managed_node2 11661 1726882393.74045: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000444 11661 1726882393.74049: WORKER PROCESS EXITING 11661 1726882393.74394: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882393.77365: done with get_vars() 11661 1726882393.77397: done getting variables 11661 1726882393.77454: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11661 1726882393.77581: variable 'profile' from source: include params 11661 1726882393.77585: variable 'item' from source: include params 11661 1726882393.77651: variable 'item' from source: include params TASK [Get the 
ansible_managed comment in ifcfg-bond0.1] ************************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:33:13 -0400 (0:00:00.075) 0:00:22.490 ****** 11661 1726882393.77690: entering _queue_task() for managed_node2/command 11661 1726882393.78002: worker is 1 (out of 1 available) 11661 1726882393.78013: exiting _queue_task() for managed_node2/command 11661 1726882393.78025: done queuing things up, now waiting for results queue to drain 11661 1726882393.78026: waiting for pending results... 11661 1726882393.78310: running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-bond0.1 11661 1726882393.78417: in run() - task 0e448fcc-3ce9-896b-2321-000000000446 11661 1726882393.78430: variable 'ansible_search_path' from source: unknown 11661 1726882393.78434: variable 'ansible_search_path' from source: unknown 11661 1726882393.78468: calling self._execute() 11661 1726882393.78563: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882393.78570: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882393.78580: variable 'omit' from source: magic vars 11661 1726882393.80354: variable 'ansible_distribution_major_version' from source: facts 11661 1726882393.80365: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882393.80487: variable 'profile_stat' from source: set_fact 11661 1726882393.80499: Evaluated conditional (profile_stat.stat.exists): False 11661 1726882393.80503: when evaluation is False, skipping this task 11661 1726882393.80505: _execute() done 11661 1726882393.80508: dumping result to json 11661 1726882393.80510: done dumping result, returning 11661 1726882393.80517: done running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-bond0.1 [0e448fcc-3ce9-896b-2321-000000000446] 11661 
1726882393.80522: sending task result for task 0e448fcc-3ce9-896b-2321-000000000446 11661 1726882393.80612: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000446 11661 1726882393.80616: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11661 1726882393.80669: no more pending results, returning what we have 11661 1726882393.80673: results queue empty 11661 1726882393.80674: checking for any_errors_fatal 11661 1726882393.80681: done checking for any_errors_fatal 11661 1726882393.80681: checking for max_fail_percentage 11661 1726882393.80683: done checking for max_fail_percentage 11661 1726882393.80684: checking to see if all hosts have failed and the running result is not ok 11661 1726882393.80684: done checking to see if all hosts have failed 11661 1726882393.80685: getting the remaining hosts for this loop 11661 1726882393.80687: done getting the remaining hosts for this loop 11661 1726882393.80691: getting the next task for host managed_node2 11661 1726882393.80698: done getting next task for host managed_node2 11661 1726882393.80700: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 11661 1726882393.80704: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11661 1726882393.80708: getting variables 11661 1726882393.80709: in VariableManager get_vars() 11661 1726882393.80742: Calling all_inventory to load vars for managed_node2 11661 1726882393.80744: Calling groups_inventory to load vars for managed_node2 11661 1726882393.80746: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882393.80758: Calling all_plugins_play to load vars for managed_node2 11661 1726882393.80760: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882393.80762: Calling groups_plugins_play to load vars for managed_node2 11661 1726882393.83855: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882393.88190: done with get_vars() 11661 1726882393.88221: done getting variables 11661 1726882393.88285: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11661 1726882393.88400: variable 'profile' from source: include params 11661 1726882393.88404: variable 'item' from source: include params 11661 1726882393.88465: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0.1] ********************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:33:13 -0400 (0:00:00.108) 0:00:22.598 ****** 11661 1726882393.88496: entering _queue_task() for managed_node2/set_fact 11661 1726882393.89504: worker is 1 (out of 1 available) 11661 1726882393.89516: exiting _queue_task() for managed_node2/set_fact 11661 
1726882393.89529: done queuing things up, now waiting for results queue to drain 11661 1726882393.89530: waiting for pending results... 11661 1726882393.89959: running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-bond0.1 11661 1726882393.90098: in run() - task 0e448fcc-3ce9-896b-2321-000000000447 11661 1726882393.90125: variable 'ansible_search_path' from source: unknown 11661 1726882393.90134: variable 'ansible_search_path' from source: unknown 11661 1726882393.90180: calling self._execute() 11661 1726882393.90310: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882393.90330: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882393.90344: variable 'omit' from source: magic vars 11661 1726882393.90767: variable 'ansible_distribution_major_version' from source: facts 11661 1726882393.90786: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882393.90919: variable 'profile_stat' from source: set_fact 11661 1726882393.90937: Evaluated conditional (profile_stat.stat.exists): False 11661 1726882393.90944: when evaluation is False, skipping this task 11661 1726882393.90955: _execute() done 11661 1726882393.90967: dumping result to json 11661 1726882393.90976: done dumping result, returning 11661 1726882393.90989: done running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-bond0.1 [0e448fcc-3ce9-896b-2321-000000000447] 11661 1726882393.91021: sending task result for task 0e448fcc-3ce9-896b-2321-000000000447 skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11661 1726882393.91182: no more pending results, returning what we have 11661 1726882393.91187: results queue empty 11661 1726882393.91189: checking for any_errors_fatal 11661 1726882393.91195: done checking for any_errors_fatal 11661 1726882393.91196: 
checking for max_fail_percentage 11661 1726882393.91198: done checking for max_fail_percentage 11661 1726882393.91198: checking to see if all hosts have failed and the running result is not ok 11661 1726882393.91199: done checking to see if all hosts have failed 11661 1726882393.91200: getting the remaining hosts for this loop 11661 1726882393.91202: done getting the remaining hosts for this loop 11661 1726882393.91205: getting the next task for host managed_node2 11661 1726882393.91214: done getting next task for host managed_node2 11661 1726882393.91217: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 11661 1726882393.91221: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11661 1726882393.91227: getting variables 11661 1726882393.91229: in VariableManager get_vars() 11661 1726882393.91277: Calling all_inventory to load vars for managed_node2 11661 1726882393.91280: Calling groups_inventory to load vars for managed_node2 11661 1726882393.91283: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882393.91297: Calling all_plugins_play to load vars for managed_node2 11661 1726882393.91300: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882393.91304: Calling groups_plugins_play to load vars for managed_node2 11661 1726882393.93409: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000447 11661 1726882393.93412: WORKER PROCESS EXITING 11661 1726882393.94910: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882393.97884: done with get_vars() 11661 1726882393.97912: done getting variables 11661 1726882393.97978: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11661 1726882393.98800: variable 'profile' from source: include params 11661 1726882393.98804: variable 'item' from source: include params 11661 1726882393.98869: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0.1] **************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:33:13 -0400 (0:00:00.104) 0:00:22.702 ****** 11661 1726882393.98900: entering _queue_task() for managed_node2/command 11661 1726882393.99218: worker is 1 (out of 1 available) 11661 1726882393.99232: exiting _queue_task() for managed_node2/command 11661 
1726882393.99244: done queuing things up, now waiting for results queue to drain 11661 1726882393.99246: waiting for pending results... 11661 1726882394.00468: running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-bond0.1 11661 1726882394.00600: in run() - task 0e448fcc-3ce9-896b-2321-000000000448 11661 1726882394.00647: variable 'ansible_search_path' from source: unknown 11661 1726882394.00655: variable 'ansible_search_path' from source: unknown 11661 1726882394.00771: calling self._execute() 11661 1726882394.00896: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882394.01578: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882394.01593: variable 'omit' from source: magic vars 11661 1726882394.01933: variable 'ansible_distribution_major_version' from source: facts 11661 1726882394.01950: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882394.02082: variable 'profile_stat' from source: set_fact 11661 1726882394.02104: Evaluated conditional (profile_stat.stat.exists): False 11661 1726882394.02113: when evaluation is False, skipping this task 11661 1726882394.02122: _execute() done 11661 1726882394.02130: dumping result to json 11661 1726882394.02140: done dumping result, returning 11661 1726882394.02152: done running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-bond0.1 [0e448fcc-3ce9-896b-2321-000000000448] 11661 1726882394.02165: sending task result for task 0e448fcc-3ce9-896b-2321-000000000448 skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11661 1726882394.02315: no more pending results, returning what we have 11661 1726882394.02319: results queue empty 11661 1726882394.02320: checking for any_errors_fatal 11661 1726882394.02326: done checking for any_errors_fatal 11661 1726882394.02327: checking for 
max_fail_percentage 11661 1726882394.02329: done checking for max_fail_percentage 11661 1726882394.02329: checking to see if all hosts have failed and the running result is not ok 11661 1726882394.02330: done checking to see if all hosts have failed 11661 1726882394.02331: getting the remaining hosts for this loop 11661 1726882394.02332: done getting the remaining hosts for this loop 11661 1726882394.02336: getting the next task for host managed_node2 11661 1726882394.02342: done getting next task for host managed_node2 11661 1726882394.02344: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 11661 1726882394.02348: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11661 1726882394.02354: getting variables 11661 1726882394.02356: in VariableManager get_vars() 11661 1726882394.02397: Calling all_inventory to load vars for managed_node2 11661 1726882394.02400: Calling groups_inventory to load vars for managed_node2 11661 1726882394.02402: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882394.02415: Calling all_plugins_play to load vars for managed_node2 11661 1726882394.02417: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882394.02420: Calling groups_plugins_play to load vars for managed_node2 11661 1726882394.03487: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000448 11661 1726882394.03490: WORKER PROCESS EXITING 11661 1726882394.05148: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882394.07280: done with get_vars() 11661 1726882394.07304: done getting variables 11661 1726882394.07393: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11661 1726882394.07543: variable 'profile' from source: include params 11661 1726882394.07546: variable 'item' from source: include params 11661 1726882394.07614: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0.1] ************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:33:14 -0400 (0:00:00.087) 0:00:22.789 ****** 11661 1726882394.07646: entering _queue_task() for managed_node2/set_fact 11661 1726882394.08048: worker is 1 (out of 1 available) 11661 1726882394.08062: exiting _queue_task() for managed_node2/set_fact 11661 
1726882394.08081: done queuing things up, now waiting for results queue to drain 11661 1726882394.08082: waiting for pending results... 11661 1726882394.08955: running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-bond0.1 11661 1726882394.09220: in run() - task 0e448fcc-3ce9-896b-2321-000000000449 11661 1726882394.09285: variable 'ansible_search_path' from source: unknown 11661 1726882394.09296: variable 'ansible_search_path' from source: unknown 11661 1726882394.09343: calling self._execute() 11661 1726882394.09472: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882394.09484: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882394.09500: variable 'omit' from source: magic vars 11661 1726882394.09908: variable 'ansible_distribution_major_version' from source: facts 11661 1726882394.09925: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882394.10048: variable 'profile_stat' from source: set_fact 11661 1726882394.10069: Evaluated conditional (profile_stat.stat.exists): False 11661 1726882394.10081: when evaluation is False, skipping this task 11661 1726882394.10087: _execute() done 11661 1726882394.10093: dumping result to json 11661 1726882394.10098: done dumping result, returning 11661 1726882394.10107: done running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-bond0.1 [0e448fcc-3ce9-896b-2321-000000000449] 11661 1726882394.10115: sending task result for task 0e448fcc-3ce9-896b-2321-000000000449 skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11661 1726882394.10257: no more pending results, returning what we have 11661 1726882394.10262: results queue empty 11661 1726882394.10262: checking for any_errors_fatal 11661 1726882394.10271: done checking for any_errors_fatal 11661 1726882394.10272: checking for 
max_fail_percentage 11661 1726882394.10273: done checking for max_fail_percentage 11661 1726882394.10274: checking to see if all hosts have failed and the running result is not ok 11661 1726882394.10275: done checking to see if all hosts have failed 11661 1726882394.10275: getting the remaining hosts for this loop 11661 1726882394.10277: done getting the remaining hosts for this loop 11661 1726882394.10281: getting the next task for host managed_node2 11661 1726882394.10289: done getting next task for host managed_node2 11661 1726882394.10293: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 11661 1726882394.10296: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11661 1726882394.10300: getting variables 11661 1726882394.10302: in VariableManager get_vars() 11661 1726882394.10342: Calling all_inventory to load vars for managed_node2 11661 1726882394.10345: Calling groups_inventory to load vars for managed_node2 11661 1726882394.10347: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882394.10361: Calling all_plugins_play to load vars for managed_node2 11661 1726882394.10365: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882394.10369: Calling groups_plugins_play to load vars for managed_node2 11661 1726882394.10889: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000449 11661 1726882394.10892: WORKER PROCESS EXITING 11661 1726882394.12353: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882394.15337: done with get_vars() 11661 1726882394.15360: done getting variables 11661 1726882394.15421: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11661 1726882394.15538: variable 'profile' from source: include params 11661 1726882394.15541: variable 'item' from source: include params 11661 1726882394.15602: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0.1'] ************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 21:33:14 -0400 (0:00:00.079) 0:00:22.869 ****** 11661 1726882394.15631: entering _queue_task() for managed_node2/assert 11661 1726882394.16056: worker is 1 (out of 1 available) 11661 1726882394.16072: exiting _queue_task() for managed_node2/assert 11661 
1726882394.16620: done queuing things up, now waiting for results queue to drain 11661 1726882394.16622: waiting for pending results... 11661 1726882394.17249: running TaskExecutor() for managed_node2/TASK: Assert that the profile is present - 'bond0.1' 11661 1726882394.17393: in run() - task 0e448fcc-3ce9-896b-2321-00000000026e 11661 1726882394.17415: variable 'ansible_search_path' from source: unknown 11661 1726882394.17421: variable 'ansible_search_path' from source: unknown 11661 1726882394.17462: calling self._execute() 11661 1726882394.17648: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882394.17662: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882394.17681: variable 'omit' from source: magic vars 11661 1726882394.18111: variable 'ansible_distribution_major_version' from source: facts 11661 1726882394.18135: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882394.18146: variable 'omit' from source: magic vars 11661 1726882394.18193: variable 'omit' from source: magic vars 11661 1726882394.18388: variable 'profile' from source: include params 11661 1726882394.18398: variable 'item' from source: include params 11661 1726882394.18468: variable 'item' from source: include params 11661 1726882394.18491: variable 'omit' from source: magic vars 11661 1726882394.18539: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11661 1726882394.18586: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11661 1726882394.18615: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11661 1726882394.18638: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882394.18657: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882394.18696: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11661 1726882394.18744: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882394.18752: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882394.18859: Set connection var ansible_connection to ssh 11661 1726882394.18897: Set connection var ansible_pipelining to False 11661 1726882394.19006: Set connection var ansible_module_compression to ZIP_DEFLATED 11661 1726882394.19019: Set connection var ansible_timeout to 10 11661 1726882394.19025: Set connection var ansible_shell_type to sh 11661 1726882394.19036: Set connection var ansible_shell_executable to /bin/sh 11661 1726882394.19065: variable 'ansible_shell_executable' from source: unknown 11661 1726882394.19073: variable 'ansible_connection' from source: unknown 11661 1726882394.19080: variable 'ansible_module_compression' from source: unknown 11661 1726882394.19086: variable 'ansible_shell_type' from source: unknown 11661 1726882394.19092: variable 'ansible_shell_executable' from source: unknown 11661 1726882394.19098: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882394.19110: variable 'ansible_pipelining' from source: unknown 11661 1726882394.19117: variable 'ansible_timeout' from source: unknown 11661 1726882394.19125: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882394.19382: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11661 1726882394.19445: variable 'omit' from source: magic vars 11661 1726882394.19456: starting 
attempt loop 11661 1726882394.19546: running the handler 11661 1726882394.19783: variable 'lsr_net_profile_exists' from source: set_fact 11661 1726882394.19794: Evaluated conditional (lsr_net_profile_exists): True 11661 1726882394.19805: handler run complete 11661 1726882394.19824: attempt loop complete, returning result 11661 1726882394.19830: _execute() done 11661 1726882394.19835: dumping result to json 11661 1726882394.19842: done dumping result, returning 11661 1726882394.19853: done running TaskExecutor() for managed_node2/TASK: Assert that the profile is present - 'bond0.1' [0e448fcc-3ce9-896b-2321-00000000026e] 11661 1726882394.19868: sending task result for task 0e448fcc-3ce9-896b-2321-00000000026e ok: [managed_node2] => { "changed": false } MSG: All assertions passed 11661 1726882394.20087: no more pending results, returning what we have 11661 1726882394.20112: results queue empty 11661 1726882394.20113: checking for any_errors_fatal 11661 1726882394.20123: done checking for any_errors_fatal 11661 1726882394.20124: checking for max_fail_percentage 11661 1726882394.20127: done checking for max_fail_percentage 11661 1726882394.20127: checking to see if all hosts have failed and the running result is not ok 11661 1726882394.20129: done checking to see if all hosts have failed 11661 1726882394.20130: getting the remaining hosts for this loop 11661 1726882394.20131: done getting the remaining hosts for this loop 11661 1726882394.20135: getting the next task for host managed_node2 11661 1726882394.20143: done getting next task for host managed_node2 11661 1726882394.20147: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 11661 1726882394.20150: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11661 1726882394.20154: getting variables 11661 1726882394.20156: in VariableManager get_vars() 11661 1726882394.20203: Calling all_inventory to load vars for managed_node2 11661 1726882394.20206: Calling groups_inventory to load vars for managed_node2 11661 1726882394.20209: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882394.20221: Calling all_plugins_play to load vars for managed_node2 11661 1726882394.20224: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882394.20228: Calling groups_plugins_play to load vars for managed_node2 11661 1726882394.21285: done sending task result for task 0e448fcc-3ce9-896b-2321-00000000026e 11661 1726882394.21289: WORKER PROCESS EXITING 11661 1726882394.22516: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882394.25270: done with get_vars() 11661 1726882394.25305: done getting variables 11661 1726882394.25371: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11661 1726882394.25490: variable 'profile' from source: include params 11661 1726882394.25494: variable 'item' from source: include params 11661 1726882394.25556: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0.1'] ********* task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 21:33:14 -0400 (0:00:00.100) 0:00:22.970 ****** 11661 1726882394.25669: entering _queue_task() for managed_node2/assert 11661 1726882394.25997: worker is 1 (out of 1 available) 11661 1726882394.26009: exiting _queue_task() for managed_node2/assert 11661 1726882394.26021: done queuing things up, now waiting for results queue to drain 11661 1726882394.26023: waiting for pending results... 11661 1726882394.26310: running TaskExecutor() for managed_node2/TASK: Assert that the ansible managed comment is present in 'bond0.1' 11661 1726882394.26425: in run() - task 0e448fcc-3ce9-896b-2321-00000000026f 11661 1726882394.26448: variable 'ansible_search_path' from source: unknown 11661 1726882394.26456: variable 'ansible_search_path' from source: unknown 11661 1726882394.26501: calling self._execute() 11661 1726882394.26606: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882394.26618: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882394.26632: variable 'omit' from source: magic vars 11661 1726882394.27002: variable 'ansible_distribution_major_version' from source: facts 11661 1726882394.27022: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882394.27034: variable 'omit' from source: magic vars 11661 1726882394.27079: variable 'omit' from source: magic vars 11661 1726882394.27181: variable 'profile' from source: include params 11661 1726882394.27191: variable 'item' from source: include params 11661 1726882394.27259: variable 'item' from source: include params 11661 1726882394.27286: variable 'omit' from source: magic vars 11661 1726882394.27336: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11661 1726882394.27378: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11661 1726882394.27405: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11661 1726882394.27429: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882394.27451: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882394.27488: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11661 1726882394.27497: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882394.27506: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882394.27611: Set connection var ansible_connection to ssh 11661 1726882394.27623: Set connection var ansible_pipelining to False 11661 1726882394.27632: Set connection var ansible_module_compression to ZIP_DEFLATED 11661 1726882394.27644: Set connection var ansible_timeout to 10 11661 1726882394.27651: Set connection var ansible_shell_type to sh 11661 1726882394.27668: Set connection var ansible_shell_executable to /bin/sh 11661 1726882394.27694: variable 'ansible_shell_executable' from source: unknown 11661 1726882394.27701: variable 'ansible_connection' from source: unknown 11661 1726882394.27707: variable 'ansible_module_compression' from source: unknown 11661 1726882394.27714: variable 'ansible_shell_type' from source: unknown 11661 1726882394.27720: variable 'ansible_shell_executable' from source: unknown 11661 1726882394.27727: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882394.27735: variable 'ansible_pipelining' from source: unknown 11661 1726882394.27742: variable 'ansible_timeout' from source: unknown 11661 1726882394.27749: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 
1726882394.27898: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11661 1726882394.27916: variable 'omit' from source: magic vars 11661 1726882394.27927: starting attempt loop 11661 1726882394.27933: running the handler 11661 1726882394.28054: variable 'lsr_net_profile_ansible_managed' from source: set_fact 11661 1726882394.28068: Evaluated conditional (lsr_net_profile_ansible_managed): True 11661 1726882394.28079: handler run complete 11661 1726882394.28102: attempt loop complete, returning result 11661 1726882394.28109: _execute() done 11661 1726882394.28116: dumping result to json 11661 1726882394.28123: done dumping result, returning 11661 1726882394.28135: done running TaskExecutor() for managed_node2/TASK: Assert that the ansible managed comment is present in 'bond0.1' [0e448fcc-3ce9-896b-2321-00000000026f] 11661 1726882394.28145: sending task result for task 0e448fcc-3ce9-896b-2321-00000000026f ok: [managed_node2] => { "changed": false } MSG: All assertions passed 11661 1726882394.28300: no more pending results, returning what we have 11661 1726882394.28304: results queue empty 11661 1726882394.28305: checking for any_errors_fatal 11661 1726882394.28312: done checking for any_errors_fatal 11661 1726882394.28313: checking for max_fail_percentage 11661 1726882394.28315: done checking for max_fail_percentage 11661 1726882394.28316: checking to see if all hosts have failed and the running result is not ok 11661 1726882394.28317: done checking to see if all hosts have failed 11661 1726882394.28318: getting the remaining hosts for this loop 11661 1726882394.28319: done getting the remaining hosts for this loop 11661 1726882394.28323: getting the next task for host managed_node2 11661 1726882394.28330: done 
getting next task for host managed_node2 11661 1726882394.28333: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 11661 1726882394.28336: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11661 1726882394.28341: getting variables 11661 1726882394.28343: in VariableManager get_vars() 11661 1726882394.28390: Calling all_inventory to load vars for managed_node2 11661 1726882394.28393: Calling groups_inventory to load vars for managed_node2 11661 1726882394.28396: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882394.28407: Calling all_plugins_play to load vars for managed_node2 11661 1726882394.28411: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882394.28415: Calling groups_plugins_play to load vars for managed_node2 11661 1726882394.29383: done sending task result for task 0e448fcc-3ce9-896b-2321-00000000026f 11661 1726882394.29387: WORKER PROCESS EXITING 11661 1726882394.30258: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882394.31950: done with get_vars() 11661 1726882394.31982: done getting variables 11661 1726882394.32041: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=True) 11661 1726882394.32155: variable 'profile' from source: include params 11661 1726882394.32159: variable 'item' from source: include params 11661 1726882394.32218: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0.1] *************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 21:33:14 -0400 (0:00:00.065) 0:00:23.035 ****** 11661 1726882394.32254: entering _queue_task() for managed_node2/assert 11661 1726882394.32570: worker is 1 (out of 1 available) 11661 1726882394.32582: exiting _queue_task() for managed_node2/assert 11661 1726882394.32594: done queuing things up, now waiting for results queue to drain 11661 1726882394.32596: waiting for pending results... 11661 1726882394.32879: running TaskExecutor() for managed_node2/TASK: Assert that the fingerprint comment is present in bond0.1 11661 1726882394.32990: in run() - task 0e448fcc-3ce9-896b-2321-000000000270 11661 1726882394.33008: variable 'ansible_search_path' from source: unknown 11661 1726882394.33016: variable 'ansible_search_path' from source: unknown 11661 1726882394.33061: calling self._execute() 11661 1726882394.33175: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882394.33187: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882394.33201: variable 'omit' from source: magic vars 11661 1726882394.33544: variable 'ansible_distribution_major_version' from source: facts 11661 1726882394.33562: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882394.33581: variable 'omit' from source: magic vars 11661 1726882394.33628: variable 'omit' from source: magic vars 11661 1726882394.33733: variable 'profile' from source: include params 11661 1726882394.33743: variable 'item' from source: include params 11661 
1726882394.33816: variable 'item' from source: include params 11661 1726882394.33839: variable 'omit' from source: magic vars 11661 1726882394.33888: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11661 1726882394.33936: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11661 1726882394.33965: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11661 1726882394.33989: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882394.34007: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882394.34044: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11661 1726882394.34053: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882394.34061: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882394.34168: Set connection var ansible_connection to ssh 11661 1726882394.34180: Set connection var ansible_pipelining to False 11661 1726882394.34191: Set connection var ansible_module_compression to ZIP_DEFLATED 11661 1726882394.34204: Set connection var ansible_timeout to 10 11661 1726882394.34210: Set connection var ansible_shell_type to sh 11661 1726882394.34222: Set connection var ansible_shell_executable to /bin/sh 11661 1726882394.34253: variable 'ansible_shell_executable' from source: unknown 11661 1726882394.34261: variable 'ansible_connection' from source: unknown 11661 1726882394.34272: variable 'ansible_module_compression' from source: unknown 11661 1726882394.34279: variable 'ansible_shell_type' from source: unknown 11661 1726882394.34285: variable 'ansible_shell_executable' from source: unknown 11661 1726882394.34292: variable 'ansible_host' from source: host 
vars for 'managed_node2' 11661 1726882394.34299: variable 'ansible_pipelining' from source: unknown 11661 1726882394.34307: variable 'ansible_timeout' from source: unknown 11661 1726882394.34314: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882394.34461: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11661 1726882394.34481: variable 'omit' from source: magic vars 11661 1726882394.34492: starting attempt loop 11661 1726882394.34499: running the handler 11661 1726882394.34613: variable 'lsr_net_profile_fingerprint' from source: set_fact 11661 1726882394.34624: Evaluated conditional (lsr_net_profile_fingerprint): True 11661 1726882394.34634: handler run complete 11661 1726882394.34653: attempt loop complete, returning result 11661 1726882394.34663: _execute() done 11661 1726882394.34674: dumping result to json 11661 1726882394.34683: done dumping result, returning 11661 1726882394.34695: done running TaskExecutor() for managed_node2/TASK: Assert that the fingerprint comment is present in bond0.1 [0e448fcc-3ce9-896b-2321-000000000270] 11661 1726882394.34705: sending task result for task 0e448fcc-3ce9-896b-2321-000000000270 ok: [managed_node2] => { "changed": false } MSG: All assertions passed 11661 1726882394.34851: no more pending results, returning what we have 11661 1726882394.34855: results queue empty 11661 1726882394.34856: checking for any_errors_fatal 11661 1726882394.34866: done checking for any_errors_fatal 11661 1726882394.34867: checking for max_fail_percentage 11661 1726882394.34870: done checking for max_fail_percentage 11661 1726882394.34871: checking to see if all hosts have failed and the running result is not ok 11661 1726882394.34872: done checking to see 
if all hosts have failed 11661 1726882394.34873: getting the remaining hosts for this loop 11661 1726882394.34874: done getting the remaining hosts for this loop 11661 1726882394.34878: getting the next task for host managed_node2 11661 1726882394.34888: done getting next task for host managed_node2 11661 1726882394.34891: ^ task is: TASK: ** TEST check polling interval 11661 1726882394.34893: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11661 1726882394.34898: getting variables 11661 1726882394.34900: in VariableManager get_vars() 11661 1726882394.34945: Calling all_inventory to load vars for managed_node2 11661 1726882394.34949: Calling groups_inventory to load vars for managed_node2 11661 1726882394.34952: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882394.34965: Calling all_plugins_play to load vars for managed_node2 11661 1726882394.34968: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882394.34972: Calling groups_plugins_play to load vars for managed_node2 11661 1726882394.35983: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000270 11661 1726882394.35987: WORKER PROCESS EXITING 11661 1726882394.36715: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882394.38546: done with get_vars() 11661 1726882394.38571: done getting variables 11661 1726882394.38630: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check 
polling interval] ****************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:75 Friday 20 September 2024 21:33:14 -0400 (0:00:00.064) 0:00:23.100 ****** 11661 1726882394.38661: entering _queue_task() for managed_node2/command 11661 1726882394.38985: worker is 1 (out of 1 available) 11661 1726882394.38997: exiting _queue_task() for managed_node2/command 11661 1726882394.39010: done queuing things up, now waiting for results queue to drain 11661 1726882394.39012: waiting for pending results... 11661 1726882394.39590: running TaskExecutor() for managed_node2/TASK: ** TEST check polling interval 11661 1726882394.39688: in run() - task 0e448fcc-3ce9-896b-2321-000000000071 11661 1726882394.39708: variable 'ansible_search_path' from source: unknown 11661 1726882394.39746: calling self._execute() 11661 1726882394.39852: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882394.39866: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882394.39886: variable 'omit' from source: magic vars 11661 1726882394.40591: variable 'ansible_distribution_major_version' from source: facts 11661 1726882394.40756: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882394.40769: variable 'omit' from source: magic vars 11661 1726882394.40794: variable 'omit' from source: magic vars 11661 1726882394.40901: variable 'controller_device' from source: play vars 11661 1726882394.40978: variable 'omit' from source: magic vars 11661 1726882394.41113: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11661 1726882394.41151: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11661 1726882394.41235: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11661 1726882394.41306: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882394.41358: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882394.41445: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11661 1726882394.41565: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882394.41574: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882394.41682: Set connection var ansible_connection to ssh 11661 1726882394.41754: Set connection var ansible_pipelining to False 11661 1726882394.41766: Set connection var ansible_module_compression to ZIP_DEFLATED 11661 1726882394.41778: Set connection var ansible_timeout to 10 11661 1726882394.41784: Set connection var ansible_shell_type to sh 11661 1726882394.41795: Set connection var ansible_shell_executable to /bin/sh 11661 1726882394.41857: variable 'ansible_shell_executable' from source: unknown 11661 1726882394.41869: variable 'ansible_connection' from source: unknown 11661 1726882394.41877: variable 'ansible_module_compression' from source: unknown 11661 1726882394.41883: variable 'ansible_shell_type' from source: unknown 11661 1726882394.41889: variable 'ansible_shell_executable' from source: unknown 11661 1726882394.41894: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882394.41901: variable 'ansible_pipelining' from source: unknown 11661 1726882394.41907: variable 'ansible_timeout' from source: unknown 11661 1726882394.41914: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882394.42070: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11661 1726882394.42088: variable 'omit' from source: magic vars 11661 1726882394.42097: starting attempt loop 11661 1726882394.42103: running the handler 11661 1726882394.42120: _low_level_execute_command(): starting 11661 1726882394.42131: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11661 1726882394.42880: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882394.42898: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882394.42914: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882394.42938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882394.42985: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882394.42999: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882394.43013: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882394.43034: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882394.43049: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882394.43062: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882394.43077: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882394.43092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882394.43108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882394.43120: stderr chunk (state=3): >>>debug2: 
checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882394.43131: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882394.43148: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882394.43225: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882394.43249: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882394.43274: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882394.43419: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882394.45114: stdout chunk (state=3): >>>/root <<< 11661 1726882394.45215: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882394.45315: stderr chunk (state=3): >>><<< 11661 1726882394.45329: stdout chunk (state=3): >>><<< 11661 1726882394.45454: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882394.45457: _low_level_execute_command(): starting 11661 1726882394.45461: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882394.4536479-12710-158692727334410 `" && echo ansible-tmp-1726882394.4536479-12710-158692727334410="` echo /root/.ansible/tmp/ansible-tmp-1726882394.4536479-12710-158692727334410 `" ) && sleep 0' 11661 1726882394.46784: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882394.46788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882394.47017: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882394.47021: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882394.47023: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882394.47083: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882394.47095: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 <<< 11661 1726882394.47217: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882394.49158: stdout chunk (state=3): >>>ansible-tmp-1726882394.4536479-12710-158692727334410=/root/.ansible/tmp/ansible-tmp-1726882394.4536479-12710-158692727334410 <<< 11661 1726882394.49256: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882394.49343: stderr chunk (state=3): >>><<< 11661 1726882394.49346: stdout chunk (state=3): >>><<< 11661 1726882394.49573: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882394.4536479-12710-158692727334410=/root/.ansible/tmp/ansible-tmp-1726882394.4536479-12710-158692727334410 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882394.49576: variable 'ansible_module_compression' from source: unknown 11661 1726882394.49579: 
ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11661k_nrvl51/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11661 1726882394.49581: variable 'ansible_facts' from source: unknown 11661 1726882394.49588: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882394.4536479-12710-158692727334410/AnsiballZ_command.py 11661 1726882394.50253: Sending initial data 11661 1726882394.50257: Sent initial data (156 bytes) 11661 1726882394.52341: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882394.52345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882394.52348: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882394.52361: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882394.52389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882394.52417: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882394.52441: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882394.52457: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882394.52473: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882394.52486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882394.52500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882394.52511: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882394.52520: stderr chunk (state=3): >>>debug2: match found <<< 11661 
1726882394.52532: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882394.52627: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882394.52654: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882394.52675: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882394.52819: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882394.54655: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11661 1726882394.54743: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 11661 1726882394.54840: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11661k_nrvl51/tmpekvv8793 /root/.ansible/tmp/ansible-tmp-1726882394.4536479-12710-158692727334410/AnsiballZ_command.py <<< 11661 1726882394.54936: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11661 1726882394.56573: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882394.56578: stderr chunk (state=3): >>><<< 11661 1726882394.56580: stdout chunk (state=3): >>><<< 11661 1726882394.56582: done transferring module to remote 11661 1726882394.56584: _low_level_execute_command(): 
starting 11661 1726882394.56586: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882394.4536479-12710-158692727334410/ /root/.ansible/tmp/ansible-tmp-1726882394.4536479-12710-158692727334410/AnsiballZ_command.py && sleep 0' 11661 1726882394.57533: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882394.57869: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882394.57897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882394.57920: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882394.57977: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882394.58004: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882394.58023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882394.58041: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882394.58056: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882394.58070: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882394.58083: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882394.58096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882394.58111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882394.58126: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882394.58137: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882394.58157: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882394.58237: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882394.58265: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882394.58283: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882394.58415: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882394.60554: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882394.60697: stderr chunk (state=3): >>><<< 11661 1726882394.60709: stdout chunk (state=3): >>><<< 11661 1726882394.60775: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882394.60779: _low_level_execute_command(): starting 11661 1726882394.60782: 
_low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882394.4536479-12710-158692727334410/AnsiballZ_command.py && sleep 0' 11661 1726882394.61545: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882394.61565: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882394.61579: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882394.61599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882394.61645: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882394.61661: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882394.61678: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882394.61696: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882394.61712: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882394.61733: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882394.61747: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882394.61766: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882394.61782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882394.61795: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882394.61806: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882394.61823: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882394.61911: stderr 
chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882394.61940: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882394.61966: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882394.62117: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882394.75518: stdout chunk (state=3): >>> {"changed": true, "stdout": "MII Polling Interval (ms): 110", "stderr": "", "rc": 0, "cmd": ["grep", "Polling Interval", "/proc/net/bonding/nm-bond"], "start": "2024-09-20 21:33:14.749096", "end": "2024-09-20 21:33:14.752659", "delta": "0:00:00.003563", "msg": "", "invocation": {"module_args": {"_raw_params": "grep 'Polling Interval' /proc/net/bonding/nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11661 1726882394.76850: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 11661 1726882394.76854: stdout chunk (state=3): >>><<< 11661 1726882394.76857: stderr chunk (state=3): >>><<< 11661 1726882394.76970: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "MII Polling Interval (ms): 110", "stderr": "", "rc": 0, "cmd": ["grep", "Polling Interval", "/proc/net/bonding/nm-bond"], "start": "2024-09-20 21:33:14.749096", "end": "2024-09-20 21:33:14.752659", "delta": "0:00:00.003563", "msg": "", "invocation": {"module_args": {"_raw_params": "grep 'Polling Interval' /proc/net/bonding/nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
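The JSON blob on stdout above is the serialized result of the `ansible.legacy.command` module, which the controller parses before evaluating the task's conditional (`Evaluated conditional ('110' in result.stdout): True` a few entries below). A minimal illustrative sketch of that check, using the field names exactly as they appear in the payload — the parsing code is a stand-in, not Ansible's actual implementation:

```python
import json

# Module result as it appears on stdout in the log above,
# trimmed to the fields the task's conditional actually uses.
raw = """{"changed": true,
          "stdout": "MII Polling Interval (ms): 110",
          "stderr": "", "rc": 0,
          "cmd": ["grep", "Polling Interval", "/proc/net/bonding/nm-bond"]}"""

result = json.loads(raw)

# The play's assertion, as evaluated in the log:
#   Evaluated conditional ('110' in result.stdout): True
assert result["rc"] == 0
assert "110" in result["stdout"]
print("polling interval check passed:", result["stdout"])
```

The same shape explains the `ok:` summary printed later: `rc`, `cmd`, `start`, `end`, and `delta` in the task result are lifted straight from this JSON.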
11661 1726882394.76980: done with _execute_module (ansible.legacy.command, {'_raw_params': "grep 'Polling Interval' /proc/net/bonding/nm-bond", '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882394.4536479-12710-158692727334410/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11661 1726882394.76983: _low_level_execute_command(): starting 11661 1726882394.76986: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882394.4536479-12710-158692727334410/ > /dev/null 2>&1 && sleep 0' 11661 1726882394.77653: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882394.77673: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882394.77689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882394.77709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882394.77755: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882394.77776: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882394.77791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882394.77810: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882394.77823: 
stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882394.77834: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882394.77847: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882394.77862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882394.77886: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882394.77899: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882394.77911: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882394.77926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882394.78007: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882394.78031: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882394.78048: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882394.78186: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882394.80061: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882394.80140: stderr chunk (state=3): >>><<< 11661 1726882394.80152: stdout chunk (state=3): >>><<< 11661 1726882394.80369: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882394.80373: handler run complete 11661 1726882394.80375: Evaluated conditional (False): False 11661 1726882394.80377: variable 'result' from source: unknown 11661 1726882394.80399: Evaluated conditional ('110' in result.stdout): True 11661 1726882394.80415: attempt loop complete, returning result 11661 1726882394.80422: _execute() done 11661 1726882394.80429: dumping result to json 11661 1726882394.80438: done dumping result, returning 11661 1726882394.80449: done running TaskExecutor() for managed_node2/TASK: ** TEST check polling interval [0e448fcc-3ce9-896b-2321-000000000071] 11661 1726882394.80459: sending task result for task 0e448fcc-3ce9-896b-2321-000000000071 ok: [managed_node2] => { "attempts": 1, "changed": false, "cmd": [ "grep", "Polling Interval", "/proc/net/bonding/nm-bond" ], "delta": "0:00:00.003563", "end": "2024-09-20 21:33:14.752659", "rc": 0, "start": "2024-09-20 21:33:14.749096" } STDOUT: MII Polling Interval (ms): 110 11661 1726882394.80669: no more pending results, returning what we have 11661 1726882394.80673: results queue empty 11661 1726882394.80673: checking for any_errors_fatal 11661 1726882394.80679: done checking for any_errors_fatal 11661 1726882394.80679: checking for max_fail_percentage 11661 1726882394.80681: done checking 
for max_fail_percentage 11661 1726882394.80682: checking to see if all hosts have failed and the running result is not ok 11661 1726882394.80683: done checking to see if all hosts have failed 11661 1726882394.80684: getting the remaining hosts for this loop 11661 1726882394.80685: done getting the remaining hosts for this loop 11661 1726882394.80688: getting the next task for host managed_node2 11661 1726882394.80694: done getting next task for host managed_node2 11661 1726882394.80697: ^ task is: TASK: ** TEST check IPv4 11661 1726882394.80699: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11661 1726882394.80702: getting variables 11661 1726882394.80704: in VariableManager get_vars() 11661 1726882394.80748: Calling all_inventory to load vars for managed_node2 11661 1726882394.80751: Calling groups_inventory to load vars for managed_node2 11661 1726882394.80754: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882394.80772: Calling all_plugins_play to load vars for managed_node2 11661 1726882394.80775: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882394.80780: Calling groups_plugins_play to load vars for managed_node2 11661 1726882394.81428: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000071 11661 1726882394.81431: WORKER PROCESS EXITING 11661 1726882394.82604: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882394.84364: done with get_vars() 11661 1726882394.84397: done getting variables 11661 1726882394.84461: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check IPv4] ****************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:80 Friday 20 September 2024 21:33:14 -0400 (0:00:00.458) 0:00:23.558 ****** 11661 1726882394.84497: entering _queue_task() for managed_node2/command 11661 1726882394.84820: worker is 1 (out of 1 available) 11661 1726882394.84832: exiting _queue_task() for managed_node2/command 11661 1726882394.84843: done queuing things up, now waiting for results queue to drain 11661 1726882394.84845: waiting for pending results... 11661 1726882394.85130: running TaskExecutor() for managed_node2/TASK: ** TEST check IPv4 11661 1726882394.85229: in run() - task 0e448fcc-3ce9-896b-2321-000000000072 11661 1726882394.85249: variable 'ansible_search_path' from source: unknown 11661 1726882394.85292: calling self._execute() 11661 1726882394.85402: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882394.85413: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882394.85427: variable 'omit' from source: magic vars 11661 1726882394.85805: variable 'ansible_distribution_major_version' from source: facts 11661 1726882394.85822: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882394.85839: variable 'omit' from source: magic vars 11661 1726882394.85866: variable 'omit' from source: magic vars 11661 1726882394.85970: variable 'controller_device' from source: play vars 11661 1726882394.85996: variable 'omit' from source: magic vars 11661 1726882394.86046: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11661 1726882394.86096: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11661 1726882394.86121: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11661 1726882394.86140: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882394.86154: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882394.86190: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11661 1726882394.86201: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882394.86207: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882394.86300: Set connection var ansible_connection to ssh 11661 1726882394.86318: Set connection var ansible_pipelining to False 11661 1726882394.86329: Set connection var ansible_module_compression to ZIP_DEFLATED 11661 1726882394.86341: Set connection var ansible_timeout to 10 11661 1726882394.86348: Set connection var ansible_shell_type to sh 11661 1726882394.86359: Set connection var ansible_shell_executable to /bin/sh 11661 1726882394.86392: variable 'ansible_shell_executable' from source: unknown 11661 1726882394.86399: variable 'ansible_connection' from source: unknown 11661 1726882394.86406: variable 'ansible_module_compression' from source: unknown 11661 1726882394.86414: variable 'ansible_shell_type' from source: unknown 11661 1726882394.86424: variable 'ansible_shell_executable' from source: unknown 11661 1726882394.86430: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882394.86437: variable 'ansible_pipelining' from source: unknown 11661 1726882394.86443: variable 'ansible_timeout' from source: unknown 11661 1726882394.86451: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 
1726882394.86607: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11661 1726882394.86624: variable 'omit' from source: magic vars 11661 1726882394.86639: starting attempt loop 11661 1726882394.86646: running the handler 11661 1726882394.86672: _low_level_execute_command(): starting 11661 1726882394.86686: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11661 1726882394.87505: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882394.87523: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882394.87540: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882394.87561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882394.87609: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882394.87626: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882394.87640: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882394.87660: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882394.87677: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882394.87691: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882394.87704: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882394.87718: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882394.87739: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882394.87753: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882394.87768: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882394.87783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882394.87866: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882394.87891: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882394.87913: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882394.88050: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882394.89717: stdout chunk (state=3): >>>/root <<< 11661 1726882394.89832: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882394.89955: stderr chunk (state=3): >>><<< 11661 1726882394.89980: stdout chunk (state=3): >>><<< 11661 1726882394.90120: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 
10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882394.90124: _low_level_execute_command(): starting 11661 1726882394.90127: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882394.9001641-12730-200204011276520 `" && echo ansible-tmp-1726882394.9001641-12730-200204011276520="` echo /root/.ansible/tmp/ansible-tmp-1726882394.9001641-12730-200204011276520 `" ) && sleep 0' 11661 1726882394.90869: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882394.90893: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882394.90910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882394.90927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882394.90971: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882394.90983: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882394.91005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882394.91029: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882394.91043: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882394.91055: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882394.91076: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 
1726882394.91096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882394.91125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882394.91140: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882394.91151: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882394.91172: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882394.91283: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882394.91312: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882394.91331: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882394.91480: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882394.93412: stdout chunk (state=3): >>>ansible-tmp-1726882394.9001641-12730-200204011276520=/root/.ansible/tmp/ansible-tmp-1726882394.9001641-12730-200204011276520 <<< 11661 1726882394.93571: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882394.93600: stderr chunk (state=3): >>><<< 11661 1726882394.93603: stdout chunk (state=3): >>><<< 11661 1726882394.93625: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882394.9001641-12730-200204011276520=/root/.ansible/tmp/ansible-tmp-1726882394.9001641-12730-200204011276520 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882394.93660: variable 'ansible_module_compression' from source: unknown 11661 1726882394.93716: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11661k_nrvl51/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11661 1726882394.93753: variable 'ansible_facts' from source: unknown 11661 1726882394.93839: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882394.9001641-12730-200204011276520/AnsiballZ_command.py 11661 1726882394.93990: Sending initial data 11661 1726882394.93994: Sent initial data (156 bytes) 11661 1726882394.94991: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882394.94999: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882394.95009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882394.95023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882394.95065: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882394.95072: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882394.95081: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882394.95094: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882394.95100: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882394.95106: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882394.95113: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882394.95122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882394.95135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882394.95140: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882394.95146: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882394.95158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882394.95227: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882394.95242: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882394.95251: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882394.95379: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882394.97167: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension 
"limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11661 1726882394.97260: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 11661 1726882394.97358: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11661k_nrvl51/tmpxxo2d982 /root/.ansible/tmp/ansible-tmp-1726882394.9001641-12730-200204011276520/AnsiballZ_command.py <<< 11661 1726882394.97448: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11661 1726882394.98872: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882394.98876: stdout chunk (state=3): >>><<< 11661 1726882394.98882: stderr chunk (state=3): >>><<< 11661 1726882394.98903: done transferring module to remote 11661 1726882394.98915: _low_level_execute_command(): starting 11661 1726882394.98920: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882394.9001641-12730-200204011276520/ /root/.ansible/tmp/ansible-tmp-1726882394.9001641-12730-200204011276520/AnsiballZ_command.py && sleep 0' 11661 1726882394.99576: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882394.99586: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882394.99596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882394.99611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882394.99649: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882394.99656: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882394.99667: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882394.99686: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882394.99693: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882394.99700: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882394.99708: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882394.99717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882394.99729: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882394.99737: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882394.99744: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882394.99756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882394.99826: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882394.99843: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882394.99861: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882394.99986: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882395.01803: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882395.01807: stdout chunk (state=3): >>><<< 11661 1726882395.01813: stderr chunk (state=3): >>><<< 11661 1726882395.01830: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882395.01834: _low_level_execute_command(): starting 11661 1726882395.01838: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882394.9001641-12730-200204011276520/AnsiballZ_command.py && sleep 0' 11661 1726882395.02821: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882395.02830: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882395.02841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882395.02856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882395.02894: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882395.02901: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882395.02911: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882395.02924: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass <<< 11661 1726882395.02934: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882395.02939: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882395.02946: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882395.02956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882395.02970: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882395.02978: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882395.02987: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882395.02994: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882395.03064: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882395.03083: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882395.03095: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882395.03233: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882395.16672: stdout chunk (state=3): >>> {"changed": true, "stdout": "13: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.141/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond\n valid_lft 233sec preferred_lft 233sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "nm-bond"], "start": "2024-09-20 21:33:15.160636", "end": "2024-09-20 21:33:15.164191", "delta": "0:00:00.003555", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, 
"executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11661 1726882395.17961: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 11661 1726882395.17973: stdout chunk (state=3): >>><<< 11661 1726882395.17977: stderr chunk (state=3): >>><<< 11661 1726882395.17993: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "13: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.141/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond\n valid_lft 233sec preferred_lft 233sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "nm-bond"], "start": "2024-09-20 21:33:15.160636", "end": "2024-09-20 21:33:15.164191", "delta": "0:00:00.003555", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 11661 1726882395.18035: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -4 a s nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882394.9001641-12730-200204011276520/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11661 1726882395.18043: _low_level_execute_command(): starting 11661 1726882395.18049: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882394.9001641-12730-200204011276520/ > /dev/null 2>&1 && sleep 0' 11661 1726882395.19698: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882395.19702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882395.19875: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 11661 1726882395.19881: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882395.19895: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 11661 1726882395.19900: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882395.19906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882395.19911: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882395.19916: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882395.19929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882395.20037: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882395.20172: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882395.20179: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882395.20305: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882395.22176: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882395.22237: stderr chunk (state=3): >>><<< 11661 1726882395.22241: stdout chunk (state=3): >>><<< 11661 1726882395.22261: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882395.22272: handler run complete 11661 1726882395.22299: Evaluated conditional (False): False 11661 1726882395.22457: variable 'result' from source: set_fact 11661 1726882395.22475: Evaluated conditional ('192.0.2' in result.stdout): True 11661 1726882395.22487: attempt loop complete, returning result 11661 1726882395.22490: _execute() done 11661 1726882395.22494: dumping result to json 11661 1726882395.22500: done dumping result, returning 11661 1726882395.22507: done running TaskExecutor() for managed_node2/TASK: ** TEST check IPv4 [0e448fcc-3ce9-896b-2321-000000000072] 11661 1726882395.22513: sending task result for task 0e448fcc-3ce9-896b-2321-000000000072 11661 1726882395.22619: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000072 11661 1726882395.22621: WORKER PROCESS EXITING ok: [managed_node2] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-4", "a", "s", "nm-bond" ], "delta": "0:00:00.003555", "end": "2024-09-20 21:33:15.164191", "rc": 0, "start": "2024-09-20 21:33:15.160636" } STDOUT: 13: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet 192.0.2.141/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond valid_lft 233sec preferred_lft 233sec 11661 1726882395.22699: no more pending results, returning what we have 11661 1726882395.22703: results queue empty 11661 1726882395.22704: checking for 
any_errors_fatal 11661 1726882395.22714: done checking for any_errors_fatal 11661 1726882395.22714: checking for max_fail_percentage 11661 1726882395.22717: done checking for max_fail_percentage 11661 1726882395.22717: checking to see if all hosts have failed and the running result is not ok 11661 1726882395.22718: done checking to see if all hosts have failed 11661 1726882395.22719: getting the remaining hosts for this loop 11661 1726882395.22720: done getting the remaining hosts for this loop 11661 1726882395.22724: getting the next task for host managed_node2 11661 1726882395.22731: done getting next task for host managed_node2 11661 1726882395.22734: ^ task is: TASK: ** TEST check IPv6 11661 1726882395.22736: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11661 1726882395.22740: getting variables 11661 1726882395.22741: in VariableManager get_vars() 11661 1726882395.22781: Calling all_inventory to load vars for managed_node2 11661 1726882395.22784: Calling groups_inventory to load vars for managed_node2 11661 1726882395.22786: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882395.22796: Calling all_plugins_play to load vars for managed_node2 11661 1726882395.22798: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882395.22801: Calling groups_plugins_play to load vars for managed_node2 11661 1726882395.25726: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882395.29438: done with get_vars() 11661 1726882395.29483: done getting variables 11661 1726882395.29778: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check IPv6] ****************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:87 Friday 20 September 2024 21:33:15 -0400 (0:00:00.453) 0:00:24.011 ****** 11661 1726882395.29814: entering _queue_task() for managed_node2/command 11661 1726882395.30747: worker is 1 (out of 1 available) 11661 1726882395.30762: exiting _queue_task() for managed_node2/command 11661 1726882395.30776: done queuing things up, now waiting for results queue to drain 11661 1726882395.30778: waiting for pending results... 
11661 1726882395.31761: running TaskExecutor() for managed_node2/TASK: ** TEST check IPv6 11661 1726882395.32073: in run() - task 0e448fcc-3ce9-896b-2321-000000000073 11661 1726882395.32270: variable 'ansible_search_path' from source: unknown 11661 1726882395.32316: calling self._execute() 11661 1726882395.33079: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882395.33160: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882395.33176: variable 'omit' from source: magic vars 11661 1726882395.33527: variable 'ansible_distribution_major_version' from source: facts 11661 1726882395.33544: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882395.33556: variable 'omit' from source: magic vars 11661 1726882395.33584: variable 'omit' from source: magic vars 11661 1726882395.33682: variable 'controller_device' from source: play vars 11661 1726882395.33704: variable 'omit' from source: magic vars 11661 1726882395.33748: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11661 1726882395.34407: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11661 1726882395.34434: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11661 1726882395.34458: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882395.34478: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882395.34512: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11661 1726882395.34520: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882395.34526: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882395.34627: Set 
connection var ansible_connection to ssh 11661 1726882395.34639: Set connection var ansible_pipelining to False 11661 1726882395.34649: Set connection var ansible_module_compression to ZIP_DEFLATED 11661 1726882395.34667: Set connection var ansible_timeout to 10 11661 1726882395.34675: Set connection var ansible_shell_type to sh 11661 1726882395.34688: Set connection var ansible_shell_executable to /bin/sh 11661 1726882395.34715: variable 'ansible_shell_executable' from source: unknown 11661 1726882395.34723: variable 'ansible_connection' from source: unknown 11661 1726882395.34731: variable 'ansible_module_compression' from source: unknown 11661 1726882395.34737: variable 'ansible_shell_type' from source: unknown 11661 1726882395.34744: variable 'ansible_shell_executable' from source: unknown 11661 1726882395.34750: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882395.34757: variable 'ansible_pipelining' from source: unknown 11661 1726882395.34766: variable 'ansible_timeout' from source: unknown 11661 1726882395.34775: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882395.34908: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11661 1726882395.34925: variable 'omit' from source: magic vars 11661 1726882395.34933: starting attempt loop 11661 1726882395.34940: running the handler 11661 1726882395.34960: _low_level_execute_command(): starting 11661 1726882395.34975: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11661 1726882395.36793: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 
1726882395.36799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882395.36819: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 11661 1726882395.36824: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882395.37015: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882395.37018: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882395.37149: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882395.38838: stdout chunk (state=3): >>>/root <<< 11661 1726882395.38942: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882395.39039: stderr chunk (state=3): >>><<< 11661 1726882395.39042: stdout chunk (state=3): >>><<< 11661 1726882395.39170: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882395.39173: _low_level_execute_command(): starting 11661 1726882395.39177: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882395.390711-12747-125630344851547 `" && echo ansible-tmp-1726882395.390711-12747-125630344851547="` echo /root/.ansible/tmp/ansible-tmp-1726882395.390711-12747-125630344851547 `" ) && sleep 0' 11661 1726882395.40228: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882395.40232: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882395.40322: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 11661 1726882395.40326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882395.40340: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882395.40536: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882395.40539: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882395.40541: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882395.40658: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882395.42600: stdout chunk (state=3): >>>ansible-tmp-1726882395.390711-12747-125630344851547=/root/.ansible/tmp/ansible-tmp-1726882395.390711-12747-125630344851547 <<< 11661 1726882395.42805: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882395.42809: stdout chunk (state=3): >>><<< 11661 1726882395.42812: stderr chunk (state=3): >>><<< 11661 1726882395.43157: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882395.390711-12747-125630344851547=/root/.ansible/tmp/ansible-tmp-1726882395.390711-12747-125630344851547 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882395.43161: variable 'ansible_module_compression' from source: unknown 11661 1726882395.43165: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11661k_nrvl51/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11661 1726882395.43167: variable 'ansible_facts' from source: unknown 11661 1726882395.43169: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882395.390711-12747-125630344851547/AnsiballZ_command.py 11661 1726882395.43234: Sending initial data 11661 1726882395.43244: Sent initial data (155 bytes) 11661 1726882395.45401: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882395.45406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882395.45439: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882395.45442: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882395.45445: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882395.45515: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882395.45519: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882395.45521: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882395.45635: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882395.47447: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11661 1726882395.47541: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 11661 1726882395.47641: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11661k_nrvl51/tmpk3n1s2nl /root/.ansible/tmp/ansible-tmp-1726882395.390711-12747-125630344851547/AnsiballZ_command.py <<< 11661 1726882395.47735: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11661 1726882395.49054: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882395.49269: stderr chunk (state=3): >>><<< 11661 1726882395.49273: 
stdout chunk (state=3): >>><<< 11661 1726882395.49275: done transferring module to remote 11661 1726882395.49277: _low_level_execute_command(): starting 11661 1726882395.49361: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882395.390711-12747-125630344851547/ /root/.ansible/tmp/ansible-tmp-1726882395.390711-12747-125630344851547/AnsiballZ_command.py && sleep 0' 11661 1726882395.50025: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882395.50038: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882395.50054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882395.50082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882395.50122: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882395.50139: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882395.50155: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882395.50175: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882395.50187: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882395.50196: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882395.50207: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882395.50219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882395.50233: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882395.50249: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 
11661 1726882395.50266: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882395.50280: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882395.50365: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882395.50388: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882395.50403: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882395.50530: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882395.52402: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882395.52405: stdout chunk (state=3): >>><<< 11661 1726882395.52407: stderr chunk (state=3): >>><<< 11661 1726882395.52506: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 
debug2: Received exit status from master 0 11661 1726882395.52509: _low_level_execute_command(): starting 11661 1726882395.52512: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882395.390711-12747-125630344851547/AnsiballZ_command.py && sleep 0' 11661 1726882395.53521: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882395.53534: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882395.53545: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882395.53566: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882395.53608: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882395.53622: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882395.53633: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882395.53647: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882395.53659: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882395.53670: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882395.53679: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882395.53689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882395.53701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882395.53710: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882395.53722: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882395.53735: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882395.53815: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882395.53844: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882395.53867: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882395.54006: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882395.67507: stdout chunk (state=3): >>> {"changed": true, "stdout": "13: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::1ab/128 scope global dynamic noprefixroute \n valid_lft 235sec preferred_lft 235sec\n inet6 2001:db8::6334:ee20:a255:f6a0/64 scope global dynamic noprefixroute \n valid_lft 1799sec preferred_lft 1799sec\n inet6 fe80::e6a6:202b:1406:4246/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "nm-bond"], "start": "2024-09-20 21:33:15.669240", "end": "2024-09-20 21:33:15.672721", "delta": "0:00:00.003481", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11661 1726882395.68684: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 11661 1726882395.68730: stderr chunk (state=3): >>><<< 11661 1726882395.68733: stdout chunk (state=3): >>><<< 11661 1726882395.68749: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "13: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::1ab/128 scope global dynamic noprefixroute \n valid_lft 235sec preferred_lft 235sec\n inet6 2001:db8::6334:ee20:a255:f6a0/64 scope global dynamic noprefixroute \n valid_lft 1799sec preferred_lft 1799sec\n inet6 fe80::e6a6:202b:1406:4246/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "nm-bond"], "start": "2024-09-20 21:33:15.669240", "end": "2024-09-20 21:33:15.672721", "delta": "0:00:00.003481", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 11661 1726882395.68786: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 a s nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882395.390711-12747-125630344851547/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11661 1726882395.68795: _low_level_execute_command(): starting 11661 1726882395.68798: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882395.390711-12747-125630344851547/ > /dev/null 2>&1 && sleep 0' 11661 1726882395.69252: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882395.69261: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882395.69301: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882395.69304: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882395.69306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882395.69352: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882395.69375: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882395.69381: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882395.69482: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882395.71286: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882395.71361: stderr chunk (state=3): >>><<< 11661 1726882395.71369: stdout chunk (state=3): >>><<< 11661 1726882395.71388: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 
10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882395.71399: handler run complete 11661 1726882395.71425: Evaluated conditional (False): False 11661 1726882395.71575: variable 'result' from source: set_fact 11661 1726882395.71590: Evaluated conditional ('2001' in result.stdout): True 11661 1726882395.71601: attempt loop complete, returning result 11661 1726882395.71604: _execute() done 11661 1726882395.71606: dumping result to json 11661 1726882395.71616: done dumping result, returning 11661 1726882395.71626: done running TaskExecutor() for managed_node2/TASK: ** TEST check IPv6 [0e448fcc-3ce9-896b-2321-000000000073] 11661 1726882395.71632: sending task result for task 0e448fcc-3ce9-896b-2321-000000000073 11661 1726882395.71741: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000073 11661 1726882395.71744: WORKER PROCESS EXITING ok: [managed_node2] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-6", "a", "s", "nm-bond" ], "delta": "0:00:00.003481", "end": "2024-09-20 21:33:15.672721", "rc": 0, "start": "2024-09-20 21:33:15.669240" } STDOUT: 13: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet6 2001:db8::1ab/128 scope global dynamic noprefixroute valid_lft 235sec preferred_lft 235sec inet6 2001:db8::6334:ee20:a255:f6a0/64 scope global dynamic noprefixroute valid_lft 1799sec preferred_lft 1799sec inet6 fe80::e6a6:202b:1406:4246/64 scope link noprefixroute valid_lft forever preferred_lft forever 11661 1726882395.71817: no more pending results, returning what we have 11661 1726882395.71821: results queue empty 11661 1726882395.71822: checking for any_errors_fatal 11661 1726882395.71831: done checking for any_errors_fatal 11661 
1726882395.71832: checking for max_fail_percentage 11661 1726882395.71834: done checking for max_fail_percentage 11661 1726882395.71834: checking to see if all hosts have failed and the running result is not ok 11661 1726882395.71835: done checking to see if all hosts have failed 11661 1726882395.71836: getting the remaining hosts for this loop 11661 1726882395.71837: done getting the remaining hosts for this loop 11661 1726882395.71841: getting the next task for host managed_node2 11661 1726882395.71852: done getting next task for host managed_node2 11661 1726882395.71858: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 11661 1726882395.71862: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11661 1726882395.71882: getting variables 11661 1726882395.71884: in VariableManager get_vars() 11661 1726882395.71921: Calling all_inventory to load vars for managed_node2 11661 1726882395.71924: Calling groups_inventory to load vars for managed_node2 11661 1726882395.71925: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882395.71934: Calling all_plugins_play to load vars for managed_node2 11661 1726882395.71936: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882395.71939: Calling groups_plugins_play to load vars for managed_node2 11661 1726882395.72774: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882395.73693: done with get_vars() 11661 1726882395.73710: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:33:15 -0400 (0:00:00.439) 0:00:24.451 ****** 11661 1726882395.73784: entering _queue_task() for managed_node2/include_tasks 11661 1726882395.74010: worker is 1 (out of 1 available) 11661 1726882395.74023: exiting _queue_task() for managed_node2/include_tasks 11661 1726882395.74035: done queuing things up, now waiting for results queue to drain 11661 1726882395.74037: waiting for pending results... 
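The `** TEST check IPv6` result traced above (cmd `ip -6 a s nm-bond`, conditional `'2001' in result.stdout` evaluated True, attempt loop completing on attempt 1) is consistent with a test task along these lines. This is a hypothetical reconstruction: only the task name, command, register variable, and until-condition are taken from the log; the `retries`/`delay` values are illustrative assumptions.

```yaml
# Hypothetical reconstruction of the task traced above; only the name,
# command, registered variable, and until-condition appear in the log.
- name: "** TEST check IPv6"
  command: ip -6 a s nm-bond
  register: result
  until: "'2001' in result.stdout"
  retries: 3   # assumed -- the trace only shows attempt 1 succeeding
  delay: 2     # assumed
```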
11661 1726882395.74217: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 11661 1726882395.74318: in run() - task 0e448fcc-3ce9-896b-2321-00000000007c 11661 1726882395.74329: variable 'ansible_search_path' from source: unknown 11661 1726882395.74332: variable 'ansible_search_path' from source: unknown 11661 1726882395.74363: calling self._execute() 11661 1726882395.74434: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882395.74438: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882395.74448: variable 'omit' from source: magic vars 11661 1726882395.74711: variable 'ansible_distribution_major_version' from source: facts 11661 1726882395.74721: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882395.74730: _execute() done 11661 1726882395.74733: dumping result to json 11661 1726882395.74736: done dumping result, returning 11661 1726882395.74743: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-896b-2321-00000000007c] 11661 1726882395.74747: sending task result for task 0e448fcc-3ce9-896b-2321-00000000007c 11661 1726882395.74834: done sending task result for task 0e448fcc-3ce9-896b-2321-00000000007c 11661 1726882395.74837: WORKER PROCESS EXITING 11661 1726882395.74883: no more pending results, returning what we have 11661 1726882395.74888: in VariableManager get_vars() 11661 1726882395.74931: Calling all_inventory to load vars for managed_node2 11661 1726882395.74934: Calling groups_inventory to load vars for managed_node2 11661 1726882395.74936: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882395.74950: Calling all_plugins_play to load vars for managed_node2 11661 1726882395.74953: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882395.74956: Calling 
groups_plugins_play to load vars for managed_node2 11661 1726882395.75858: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882395.76810: done with get_vars() 11661 1726882395.76826: variable 'ansible_search_path' from source: unknown 11661 1726882395.76827: variable 'ansible_search_path' from source: unknown 11661 1726882395.76858: we have included files to process 11661 1726882395.76859: generating all_blocks data 11661 1726882395.76861: done generating all_blocks data 11661 1726882395.76866: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11661 1726882395.76866: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11661 1726882395.76868: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11661 1726882395.77250: done processing included file 11661 1726882395.77252: iterating over new_blocks loaded from include file 11661 1726882395.77253: in VariableManager get_vars() 11661 1726882395.77273: done with get_vars() 11661 1726882395.77274: filtering new block on tags 11661 1726882395.77294: done filtering new block on tags 11661 1726882395.77296: in VariableManager get_vars() 11661 1726882395.77310: done with get_vars() 11661 1726882395.77311: filtering new block on tags 11661 1726882395.77335: done filtering new block on tags 11661 1726882395.77344: in VariableManager get_vars() 11661 1726882395.77369: done with get_vars() 11661 1726882395.77371: filtering new block on tags 11661 1726882395.77406: done filtering new block on tags 11661 1726882395.77408: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 11661 1726882395.77413: extending task lists for 
all hosts with included blocks 11661 1726882395.78317: done extending task lists 11661 1726882395.78318: done processing included files 11661 1726882395.78319: results queue empty 11661 1726882395.78320: checking for any_errors_fatal 11661 1726882395.78324: done checking for any_errors_fatal 11661 1726882395.78325: checking for max_fail_percentage 11661 1726882395.78326: done checking for max_fail_percentage 11661 1726882395.78327: checking to see if all hosts have failed and the running result is not ok 11661 1726882395.78327: done checking to see if all hosts have failed 11661 1726882395.78328: getting the remaining hosts for this loop 11661 1726882395.78329: done getting the remaining hosts for this loop 11661 1726882395.78331: getting the next task for host managed_node2 11661 1726882395.78336: done getting next task for host managed_node2 11661 1726882395.78339: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 11661 1726882395.78342: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11661 1726882395.78353: getting variables 11661 1726882395.78354: in VariableManager get_vars() 11661 1726882395.78371: Calling all_inventory to load vars for managed_node2 11661 1726882395.78374: Calling groups_inventory to load vars for managed_node2 11661 1726882395.78376: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882395.78380: Calling all_plugins_play to load vars for managed_node2 11661 1726882395.78383: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882395.78385: Calling groups_plugins_play to load vars for managed_node2 11661 1726882395.79541: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882395.80471: done with get_vars() 11661 1726882395.80495: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:33:15 -0400 (0:00:00.067) 0:00:24.519 ****** 11661 1726882395.80557: entering _queue_task() for managed_node2/setup 11661 1726882395.80807: worker is 1 (out of 1 available) 11661 1726882395.80822: exiting _queue_task() for managed_node2/setup 11661 1726882395.80834: done queuing things up, now waiting for results queue to drain 11661 1726882395.80836: waiting for pending results... 
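The trace above shows `roles/network/tasks/main.yml:4` (`Ensure ansible_facts used by role`) passing its `ansible_distribution_major_version != '6'` conditional and then loading `set_facts.yml` as an included file. A minimal sketch of what that task plausibly looks like, assuming the standard `include_tasks` form (the file name and the conditional are the ones the trace reports):

```yaml
# Hypothetical sketch of roles/network/tasks/main.yml:4 as traced above.
- name: Ensure ansible_facts used by role
  include_tasks: set_facts.yml
  when: ansible_distribution_major_version != '6'
```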
11661 1726882395.81030: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 11661 1726882395.81138: in run() - task 0e448fcc-3ce9-896b-2321-000000000491 11661 1726882395.81148: variable 'ansible_search_path' from source: unknown 11661 1726882395.81151: variable 'ansible_search_path' from source: unknown 11661 1726882395.81186: calling self._execute() 11661 1726882395.81261: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882395.81267: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882395.81277: variable 'omit' from source: magic vars 11661 1726882395.81546: variable 'ansible_distribution_major_version' from source: facts 11661 1726882395.81559: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882395.81710: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11661 1726882395.83304: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11661 1726882395.83358: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11661 1726882395.83387: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11661 1726882395.83413: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11661 1726882395.83433: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11661 1726882395.83495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11661 1726882395.83517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11661 1726882395.83534: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882395.83562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11661 1726882395.83575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11661 1726882395.83613: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11661 1726882395.83628: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11661 1726882395.83644: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882395.83675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11661 1726882395.83685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11661 1726882395.83798: variable '__network_required_facts' from source: role 
'' defaults 11661 1726882395.83805: variable 'ansible_facts' from source: unknown 11661 1726882395.84265: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 11661 1726882395.84271: when evaluation is False, skipping this task 11661 1726882395.84273: _execute() done 11661 1726882395.84276: dumping result to json 11661 1726882395.84278: done dumping result, returning 11661 1726882395.84286: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0e448fcc-3ce9-896b-2321-000000000491] 11661 1726882395.84290: sending task result for task 0e448fcc-3ce9-896b-2321-000000000491 11661 1726882395.84377: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000491 11661 1726882395.84379: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11661 1726882395.84458: no more pending results, returning what we have 11661 1726882395.84463: results queue empty 11661 1726882395.84466: checking for any_errors_fatal 11661 1726882395.84467: done checking for any_errors_fatal 11661 1726882395.84468: checking for max_fail_percentage 11661 1726882395.84469: done checking for max_fail_percentage 11661 1726882395.84470: checking to see if all hosts have failed and the running result is not ok 11661 1726882395.84471: done checking to see if all hosts have failed 11661 1726882395.84472: getting the remaining hosts for this loop 11661 1726882395.84473: done getting the remaining hosts for this loop 11661 1726882395.84477: getting the next task for host managed_node2 11661 1726882395.84486: done getting next task for host managed_node2 11661 1726882395.84494: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 11661 1726882395.84500: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, 
handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11661 1726882395.84516: getting variables 11661 1726882395.84517: in VariableManager get_vars() 11661 1726882395.84557: Calling all_inventory to load vars for managed_node2 11661 1726882395.84559: Calling groups_inventory to load vars for managed_node2 11661 1726882395.84561: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882395.84572: Calling all_plugins_play to load vars for managed_node2 11661 1726882395.84574: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882395.84576: Calling groups_plugins_play to load vars for managed_node2 11661 1726882395.85388: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882395.86313: done with get_vars() 11661 1726882395.86329: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:33:15 -0400 (0:00:00.058) 0:00:24.577 ****** 11661 1726882395.86404: entering _queue_task() for managed_node2/stat 11661 1726882395.86618: worker is 1 (out of 1 available) 11661 1726882395.86631: exiting _queue_task() for managed_node2/stat 11661 1726882395.86643: done queuing things up, now waiting for results queue to drain 11661 1726882395.86645: waiting for pending results... 
11661 1726882395.86828: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 11661 1726882395.86937: in run() - task 0e448fcc-3ce9-896b-2321-000000000493 11661 1726882395.86947: variable 'ansible_search_path' from source: unknown 11661 1726882395.86950: variable 'ansible_search_path' from source: unknown 11661 1726882395.86982: calling self._execute() 11661 1726882395.87115: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882395.87118: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882395.87127: variable 'omit' from source: magic vars 11661 1726882395.87384: variable 'ansible_distribution_major_version' from source: facts 11661 1726882395.87394: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882395.87508: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11661 1726882395.87713: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11661 1726882395.87744: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11661 1726882395.87772: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11661 1726882395.87798: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11661 1726882395.87862: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11661 1726882395.87880: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11661 1726882395.87898: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882395.87916: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11661 1726882395.87980: variable '__network_is_ostree' from source: set_fact 11661 1726882395.87985: Evaluated conditional (not __network_is_ostree is defined): False 11661 1726882395.87990: when evaluation is False, skipping this task 11661 1726882395.87993: _execute() done 11661 1726882395.87996: dumping result to json 11661 1726882395.87999: done dumping result, returning 11661 1726882395.88006: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0e448fcc-3ce9-896b-2321-000000000493] 11661 1726882395.88011: sending task result for task 0e448fcc-3ce9-896b-2321-000000000493 11661 1726882395.88092: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000493 11661 1726882395.88094: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 11661 1726882395.88140: no more pending results, returning what we have 11661 1726882395.88144: results queue empty 11661 1726882395.88145: checking for any_errors_fatal 11661 1726882395.88153: done checking for any_errors_fatal 11661 1726882395.88154: checking for max_fail_percentage 11661 1726882395.88156: done checking for max_fail_percentage 11661 1726882395.88157: checking to see if all hosts have failed and the running result is not ok 11661 1726882395.88158: done checking to see if all hosts have failed 11661 1726882395.88158: getting the remaining hosts for this loop 11661 1726882395.88160: done getting the remaining hosts for this loop 11661 
1726882395.88163: getting the next task for host managed_node2 11661 1726882395.88173: done getting next task for host managed_node2 11661 1726882395.88176: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 11661 1726882395.88181: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11661 1726882395.88200: getting variables 11661 1726882395.88201: in VariableManager get_vars() 11661 1726882395.88235: Calling all_inventory to load vars for managed_node2 11661 1726882395.88237: Calling groups_inventory to load vars for managed_node2 11661 1726882395.88239: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882395.88247: Calling all_plugins_play to load vars for managed_node2 11661 1726882395.88250: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882395.88252: Calling groups_plugins_play to load vars for managed_node2 11661 1726882395.89881: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882395.90815: done with get_vars() 11661 1726882395.90832: done getting variables 11661 1726882395.90880: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:33:15 -0400 (0:00:00.045) 0:00:24.622 ****** 11661 1726882395.90907: entering _queue_task() for managed_node2/set_fact 11661 1726882395.91142: worker is 1 (out of 1 available) 11661 1726882395.91154: exiting _queue_task() for managed_node2/set_fact 11661 1726882395.91167: done queuing things up, now waiting for results queue to drain 11661 1726882395.91169: waiting for pending results... 
11661 1726882395.91353: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 11661 1726882395.91471: in run() - task 0e448fcc-3ce9-896b-2321-000000000494 11661 1726882395.91484: variable 'ansible_search_path' from source: unknown 11661 1726882395.91488: variable 'ansible_search_path' from source: unknown 11661 1726882395.91516: calling self._execute() 11661 1726882395.91591: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882395.91594: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882395.91604: variable 'omit' from source: magic vars 11661 1726882395.92040: variable 'ansible_distribution_major_version' from source: facts 11661 1726882395.92194: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882395.92371: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11661 1726882395.92654: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11661 1726882395.92703: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11661 1726882395.92744: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11661 1726882395.92784: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11661 1726882395.92870: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11661 1726882395.92898: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11661 1726882395.92923: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882395.92959: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11661 1726882395.93054: variable '__network_is_ostree' from source: set_fact 11661 1726882395.93069: Evaluated conditional (not __network_is_ostree is defined): False 11661 1726882395.93077: when evaluation is False, skipping this task 11661 1726882395.93084: _execute() done 11661 1726882395.93090: dumping result to json 11661 1726882395.93097: done dumping result, returning 11661 1726882395.93107: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0e448fcc-3ce9-896b-2321-000000000494] 11661 1726882395.93117: sending task result for task 0e448fcc-3ce9-896b-2321-000000000494 skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 11661 1726882395.93257: no more pending results, returning what we have 11661 1726882395.93261: results queue empty 11661 1726882395.93262: checking for any_errors_fatal 11661 1726882395.93271: done checking for any_errors_fatal 11661 1726882395.93272: checking for max_fail_percentage 11661 1726882395.93274: done checking for max_fail_percentage 11661 1726882395.93275: checking to see if all hosts have failed and the running result is not ok 11661 1726882395.93275: done checking to see if all hosts have failed 11661 1726882395.93276: getting the remaining hosts for this loop 11661 1726882395.93278: done getting the remaining hosts for this loop 11661 1726882395.93281: getting the next task for host managed_node2 11661 1726882395.93292: done getting next task for host managed_node2 11661 
1726882395.93296: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 11661 1726882395.93301: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11661 1726882395.93322: getting variables 11661 1726882395.93324: in VariableManager get_vars() 11661 1726882395.93368: Calling all_inventory to load vars for managed_node2 11661 1726882395.93370: Calling groups_inventory to load vars for managed_node2 11661 1726882395.93373: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882395.93383: Calling all_plugins_play to load vars for managed_node2 11661 1726882395.93385: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882395.93389: Calling groups_plugins_play to load vars for managed_node2 11661 1726882395.94383: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000494 11661 1726882395.94386: WORKER PROCESS EXITING 11661 1726882395.95137: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882395.97116: done with get_vars() 11661 1726882395.97144: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:33:15 -0400 (0:00:00.063) 0:00:24.685 ****** 11661 1726882395.97249: entering _queue_task() for managed_node2/service_facts 11661 1726882395.97612: worker is 1 (out of 1 available) 11661 1726882395.97625: exiting _queue_task() for managed_node2/service_facts 11661 1726882395.97637: done queuing things up, now waiting for results queue to drain 11661 1726882395.97638: waiting for pending results... 
11661 1726882395.97970: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 11661 1726882395.98130: in run() - task 0e448fcc-3ce9-896b-2321-000000000496 11661 1726882395.98148: variable 'ansible_search_path' from source: unknown 11661 1726882395.98154: variable 'ansible_search_path' from source: unknown 11661 1726882395.98203: calling self._execute() 11661 1726882395.98319: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882395.98335: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882395.98351: variable 'omit' from source: magic vars 11661 1726882395.98765: variable 'ansible_distribution_major_version' from source: facts 11661 1726882395.98787: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882395.98805: variable 'omit' from source: magic vars 11661 1726882395.98911: variable 'omit' from source: magic vars 11661 1726882395.99001: variable 'omit' from source: magic vars 11661 1726882395.99120: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11661 1726882395.99227: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11661 1726882395.99317: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11661 1726882395.99343: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882395.99367: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882395.99431: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11661 1726882395.99515: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882395.99524: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 11661 1726882395.99766: Set connection var ansible_connection to ssh 11661 1726882395.99782: Set connection var ansible_pipelining to False 11661 1726882395.99799: Set connection var ansible_module_compression to ZIP_DEFLATED 11661 1726882395.99847: Set connection var ansible_timeout to 10 11661 1726882395.99856: Set connection var ansible_shell_type to sh 11661 1726882395.99871: Set connection var ansible_shell_executable to /bin/sh 11661 1726882395.99973: variable 'ansible_shell_executable' from source: unknown 11661 1726882395.99985: variable 'ansible_connection' from source: unknown 11661 1726882395.99995: variable 'ansible_module_compression' from source: unknown 11661 1726882396.00002: variable 'ansible_shell_type' from source: unknown 11661 1726882396.00009: variable 'ansible_shell_executable' from source: unknown 11661 1726882396.00015: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882396.00022: variable 'ansible_pipelining' from source: unknown 11661 1726882396.00028: variable 'ansible_timeout' from source: unknown 11661 1726882396.00035: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882396.00505: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11661 1726882396.00613: variable 'omit' from source: magic vars 11661 1726882396.00626: starting attempt loop 11661 1726882396.00633: running the handler 11661 1726882396.00658: _low_level_execute_command(): starting 11661 1726882396.00678: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11661 1726882396.02624: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 11661 1726882396.02628: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882396.02760: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 11661 1726882396.02766: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882396.02770: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882396.02857: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882396.02885: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882396.02933: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882396.03091: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882396.04711: stdout chunk (state=3): >>>/root <<< 11661 1726882396.04813: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882396.04912: stderr chunk (state=3): >>><<< 11661 1726882396.04915: stdout chunk (state=3): >>><<< 11661 1726882396.05038: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882396.05041: _low_level_execute_command(): starting 11661 1726882396.05044: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882396.049347-12791-189981732088525 `" && echo ansible-tmp-1726882396.049347-12791-189981732088525="` echo /root/.ansible/tmp/ansible-tmp-1726882396.049347-12791-189981732088525 `" ) && sleep 0' 11661 1726882396.05706: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882396.05720: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882396.05734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882396.05757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882396.05804: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882396.05824: stderr chunk (state=3): >>>debug2: match not found <<< 11661 
1726882396.05839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882396.05858: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882396.05875: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882396.05888: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882396.05901: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882396.05919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882396.05936: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882396.05949: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882396.05962: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882396.05979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882396.06057: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882396.06077: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882396.06092: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882396.06372: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882396.08145: stdout chunk (state=3): >>>ansible-tmp-1726882396.049347-12791-189981732088525=/root/.ansible/tmp/ansible-tmp-1726882396.049347-12791-189981732088525 <<< 11661 1726882396.08255: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882396.08334: stderr chunk (state=3): >>><<< 11661 1726882396.08338: stdout chunk (state=3): >>><<< 11661 1726882396.08671: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882396.049347-12791-189981732088525=/root/.ansible/tmp/ansible-tmp-1726882396.049347-12791-189981732088525 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882396.08675: variable 'ansible_module_compression' from source: unknown 11661 1726882396.08677: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11661k_nrvl51/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 11661 1726882396.08680: variable 'ansible_facts' from source: unknown 11661 1726882396.08682: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882396.049347-12791-189981732088525/AnsiballZ_service_facts.py 11661 1726882396.09023: Sending initial data 11661 1726882396.09026: Sent initial data (161 bytes) 11661 1726882396.10807: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11661 
1726882396.10811: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882396.10843: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 11661 1726882396.10848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882396.10852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882396.10909: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882396.11191: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882396.11196: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882396.11307: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882396.13083: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: 
Server supports extension "expand-path@openssh.com" revision 1 <<< 11661 1726882396.13183: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 11661 1726882396.13285: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11661k_nrvl51/tmp_0jh59le /root/.ansible/tmp/ansible-tmp-1726882396.049347-12791-189981732088525/AnsiballZ_service_facts.py <<< 11661 1726882396.13386: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11661 1726882396.14816: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882396.14969: stderr chunk (state=3): >>><<< 11661 1726882396.15090: stdout chunk (state=3): >>><<< 11661 1726882396.15093: done transferring module to remote 11661 1726882396.15095: _low_level_execute_command(): starting 11661 1726882396.15097: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882396.049347-12791-189981732088525/ /root/.ansible/tmp/ansible-tmp-1726882396.049347-12791-189981732088525/AnsiballZ_service_facts.py && sleep 0' 11661 1726882396.17349: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882396.17353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882396.17390: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 11661 1726882396.17393: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882396.17400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882396.17464: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882396.17469: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882396.17472: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882396.17592: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882396.19432: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882396.19513: stderr chunk (state=3): >>><<< 11661 1726882396.19516: stdout chunk (state=3): >>><<< 11661 1726882396.19611: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882396.19614: _low_level_execute_command(): starting 11661 1726882396.19617: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882396.049347-12791-189981732088525/AnsiballZ_service_facts.py && sleep 0' 11661 1726882396.20526: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882396.20539: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882396.20550: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882396.20569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882396.20607: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882396.20614: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882396.20624: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882396.20639: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882396.20651: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882396.20661: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882396.20673: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882396.20683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882396.20695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882396.20702: stderr chunk (state=3): >>>debug2: checking 
match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882396.20709: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882396.20718: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882396.20799: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882396.20818: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882396.20830: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882396.20969: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882397.54632: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", 
"state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": 
"systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "sta<<< 11661 1726882397.54678: stdout chunk (state=3): >>>tic", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": 
"systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", 
"status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, 
"debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-d<<< 11661 1726882397.54685: stdout chunk (state=3): >>>isable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": 
"sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": 
{"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 11661 1726882397.55998: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 11661 1726882397.56002: stdout chunk (state=3): >>><<< 11661 1726882397.56009: stderr chunk (state=3): >>><<< 11661 1726882397.56035: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": 
"stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", 
"source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": 
"enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", 
"status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": 
"running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, 
"dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", 
"source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": 
"unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 11661 1726882397.68516: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882396.049347-12791-189981732088525/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11661 1726882397.68521: _low_level_execute_command(): starting 11661 1726882397.68526: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882396.049347-12791-189981732088525/ > /dev/null 2>&1 && sleep 0' 11661 1726882397.70042: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882397.70048: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882397.70059: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882397.70083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882397.70118: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882397.70125: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882397.70134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882397.70147: stderr chunk (state=3): >>>debug1: configuration requests 
final Match pass <<< 11661 1726882397.70156: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882397.70160: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882397.70170: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882397.70183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882397.70194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882397.70201: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882397.70208: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882397.70217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882397.70294: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882397.70312: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882397.70324: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882397.70455: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882397.72545: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882397.72549: stdout chunk (state=3): >>><<< 11661 1726882397.72553: stderr chunk (state=3): >>><<< 11661 1726882397.72556: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882397.72558: handler run complete 11661 1726882397.72595: variable 'ansible_facts' from source: unknown 11661 1726882397.72702: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882397.73162: variable 'ansible_facts' from source: unknown 11661 1726882397.73198: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882397.73400: attempt loop complete, returning result 11661 1726882397.73403: _execute() done 11661 1726882397.73406: dumping result to json 11661 1726882397.73461: done dumping result, returning 11661 1726882397.73480: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0e448fcc-3ce9-896b-2321-000000000496] 11661 1726882397.73485: sending task result for task 0e448fcc-3ce9-896b-2321-000000000496 11661 1726882397.79119: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000496 11661 1726882397.79130: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 
11661 1726882397.79200: no more pending results, returning what we have 11661 1726882397.79202: results queue empty 11661 1726882397.79203: checking for any_errors_fatal 11661 1726882397.79206: done checking for any_errors_fatal 11661 1726882397.79206: checking for max_fail_percentage 11661 1726882397.79207: done checking for max_fail_percentage 11661 1726882397.79208: checking to see if all hosts have failed and the running result is not ok 11661 1726882397.79209: done checking to see if all hosts have failed 11661 1726882397.79210: getting the remaining hosts for this loop 11661 1726882397.79211: done getting the remaining hosts for this loop 11661 1726882397.79213: getting the next task for host managed_node2 11661 1726882397.79218: done getting next task for host managed_node2 11661 1726882397.79221: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 11661 1726882397.79229: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), did rescue? False, did start at task? False 11661 1726882397.79241: getting variables 11661 1726882397.79243: in VariableManager get_vars() 11661 1726882397.79271: Calling all_inventory to load vars for managed_node2 11661 1726882397.79273: Calling groups_inventory to load vars for managed_node2 11661 1726882397.79278: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882397.79283: Calling all_plugins_play to load vars for managed_node2 11661 1726882397.79284: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882397.79286: Calling groups_plugins_play to load vars for managed_node2 11661 1726882397.80002: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882397.81011: done with get_vars() 11661 1726882397.81028: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:33:17 -0400 (0:00:01.838) 0:00:26.524 ****** 11661 1726882397.81091: entering _queue_task() for managed_node2/package_facts 11661 1726882397.81325: worker is 1 (out of 1 available) 11661 1726882397.81341: exiting _queue_task() for managed_node2/package_facts 11661 1726882397.81352: done queuing things up, now waiting for results queue to drain 11661 1726882397.81354: waiting for pending results... 
11661 1726882397.81547: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 11661 1726882397.81665: in run() - task 0e448fcc-3ce9-896b-2321-000000000497 11661 1726882397.81680: variable 'ansible_search_path' from source: unknown 11661 1726882397.81685: variable 'ansible_search_path' from source: unknown 11661 1726882397.81714: calling self._execute() 11661 1726882397.81805: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882397.81811: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882397.81820: variable 'omit' from source: magic vars 11661 1726882397.82122: variable 'ansible_distribution_major_version' from source: facts 11661 1726882397.82133: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882397.82138: variable 'omit' from source: magic vars 11661 1726882397.82199: variable 'omit' from source: magic vars 11661 1726882397.82227: variable 'omit' from source: magic vars 11661 1726882397.82264: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11661 1726882397.82292: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11661 1726882397.82313: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11661 1726882397.82330: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882397.82339: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882397.82367: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11661 1726882397.82371: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882397.82374: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 11661 1726882397.82441: Set connection var ansible_connection to ssh 11661 1726882397.82445: Set connection var ansible_pipelining to False 11661 1726882397.82453: Set connection var ansible_module_compression to ZIP_DEFLATED 11661 1726882397.82460: Set connection var ansible_timeout to 10 11661 1726882397.82464: Set connection var ansible_shell_type to sh 11661 1726882397.82471: Set connection var ansible_shell_executable to /bin/sh 11661 1726882397.82488: variable 'ansible_shell_executable' from source: unknown 11661 1726882397.82491: variable 'ansible_connection' from source: unknown 11661 1726882397.82495: variable 'ansible_module_compression' from source: unknown 11661 1726882397.82497: variable 'ansible_shell_type' from source: unknown 11661 1726882397.82499: variable 'ansible_shell_executable' from source: unknown 11661 1726882397.82502: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882397.82504: variable 'ansible_pipelining' from source: unknown 11661 1726882397.82508: variable 'ansible_timeout' from source: unknown 11661 1726882397.82511: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882397.82653: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11661 1726882397.82658: variable 'omit' from source: magic vars 11661 1726882397.82665: starting attempt loop 11661 1726882397.82676: running the handler 11661 1726882397.82685: _low_level_execute_command(): starting 11661 1726882397.82692: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11661 1726882397.83290: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882397.83322: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882397.83338: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882397.83451: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882397.85114: stdout chunk (state=3): >>>/root <<< 11661 1726882397.85215: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882397.85268: stderr chunk (state=3): >>><<< 11661 1726882397.85272: stdout chunk (state=3): >>><<< 11661 1726882397.85290: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882397.85301: _low_level_execute_command(): starting 11661 1726882397.85307: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882397.8528993-12863-245626760326354 `" && echo ansible-tmp-1726882397.8528993-12863-245626760326354="` echo /root/.ansible/tmp/ansible-tmp-1726882397.8528993-12863-245626760326354 `" ) && sleep 0' 11661 1726882397.85769: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882397.85773: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882397.85808: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882397.85811: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882397.85820: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882397.85857: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882397.85880: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882397.85984: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882397.87854: stdout chunk (state=3): >>>ansible-tmp-1726882397.8528993-12863-245626760326354=/root/.ansible/tmp/ansible-tmp-1726882397.8528993-12863-245626760326354 <<< 11661 1726882397.87966: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882397.88017: stderr chunk (state=3): >>><<< 11661 1726882397.88020: stdout chunk (state=3): >>><<< 11661 1726882397.88034: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882397.8528993-12863-245626760326354=/root/.ansible/tmp/ansible-tmp-1726882397.8528993-12863-245626760326354 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882397.88078: variable 'ansible_module_compression' from source: unknown 11661 1726882397.88118: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11661k_nrvl51/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 11661 1726882397.88175: variable 'ansible_facts' from source: unknown 11661 1726882397.88310: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882397.8528993-12863-245626760326354/AnsiballZ_package_facts.py 11661 1726882397.88428: Sending initial data 11661 1726882397.88437: Sent initial data (162 bytes) 11661 1726882397.89128: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882397.89133: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882397.89169: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 11661 1726882397.89173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882397.89175: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882397.89177: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882397.89231: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882397.89234: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882397.89241: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882397.89339: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882397.91089: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11661 1726882397.91184: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 11661 1726882397.91280: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11661k_nrvl51/tmpm_vnjint /root/.ansible/tmp/ansible-tmp-1726882397.8528993-12863-245626760326354/AnsiballZ_package_facts.py <<< 11661 1726882397.91376: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11661 1726882397.93679: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882397.93783: stderr chunk (state=3): >>><<< 11661 1726882397.93787: stdout chunk (state=3): >>><<< 11661 1726882397.93802: done transferring module to remote 11661 1726882397.93811: 
_low_level_execute_command(): starting 11661 1726882397.93816: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882397.8528993-12863-245626760326354/ /root/.ansible/tmp/ansible-tmp-1726882397.8528993-12863-245626760326354/AnsiballZ_package_facts.py && sleep 0' 11661 1726882397.94343: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882397.94347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882397.94394: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11661 1726882397.94401: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882397.94415: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 11661 1726882397.94421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882397.94491: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882397.94514: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882397.94531: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882397.94666: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882397.96459: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882397.96557: stderr chunk (state=3): >>><<< 11661 1726882397.96572: stdout chunk (state=3): >>><<< 11661 1726882397.96676: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882397.96681: _low_level_execute_command(): starting 11661 1726882397.96683: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882397.8528993-12863-245626760326354/AnsiballZ_package_facts.py && sleep 0' 11661 1726882397.98063: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882397.98080: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882397.98093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882397.98109: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882397.98161: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882397.98174: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882397.98187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882397.98203: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882397.98213: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882397.98223: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882397.98237: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882397.98252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882397.98270: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882397.98284: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882397.98295: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882397.98308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882397.98392: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882397.98417: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882397.98434: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882397.98581: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882398.45057: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", 
"release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": 
"kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", 
"release": "122.el9", "e<<< 11661 1726882398.45083: stdout chunk (state=3): >>>poch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": 
"gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": nu<<< 11661 1726882398.45097: stdout chunk (state=3): >>>ll, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", 
"release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": 
"1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch":
null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": 
[{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}],
"python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", 
"release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", 
"source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": 
"4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": 
[{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release":
"2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", 
"release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": 
"liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "sour<<< 11661 1726882398.45220: stdout chunk (state=3): >>>ce": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": 
"rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": 
[{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, <<< 11661 1726882398.45223: stdout chunk (state=3): >>>"arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": 
[{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300<<< 11661 1726882398.45230: stdout chunk (state=3): >>>", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": 
"1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": 
"noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64"<<< 11661 1726882398.45263: stdout chunk (state=3): >>>, "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", 
"source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": 
[{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", 
"release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", <<< 11661 1726882398.45273: stdout chunk (state=3): >>>"release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": 
"x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": 
"x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 11661 1726882398.46798: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 11661 1726882398.46826: stderr chunk (state=3): >>><<< 11661 1726882398.46830: stdout chunk (state=3): >>><<< 11661 1726882398.46877: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": 
"10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", 
"version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": 
"8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", 
"release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", 
"version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", 
"version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": 
"5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": 
"grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": 
[{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": 
"1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", 
"version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": 
"4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": 
"1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", 
"version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": 
"1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": 
"rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": 
"rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", 
"source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": 
"perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": 
"rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", 
"epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", 
"version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": 
"python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
11661 1726882398.48475: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882397.8528993-12863-245626760326354/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11661 1726882398.48491: _low_level_execute_command(): starting 11661 1726882398.48494: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882397.8528993-12863-245626760326354/ > /dev/null 2>&1 && sleep 0' 11661 1726882398.48936: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882398.48948: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882398.48968: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 11661 1726882398.48981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 
originally 10.31.11.158 debug2: match found <<< 11661 1726882398.48991: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882398.49032: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882398.49043: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882398.49160: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882398.50991: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882398.51036: stderr chunk (state=3): >>><<< 11661 1726882398.51039: stdout chunk (state=3): >>><<< 11661 1726882398.51058: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882398.51068: handler run 
complete 11661 1726882398.51625: variable 'ansible_facts' from source: unknown 11661 1726882398.51935: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882398.53133: variable 'ansible_facts' from source: unknown 11661 1726882398.53411: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882398.53865: attempt loop complete, returning result 11661 1726882398.53880: _execute() done 11661 1726882398.53883: dumping result to json 11661 1726882398.54014: done dumping result, returning 11661 1726882398.54022: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0e448fcc-3ce9-896b-2321-000000000497] 11661 1726882398.54028: sending task result for task 0e448fcc-3ce9-896b-2321-000000000497 11661 1726882398.55439: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000497 11661 1726882398.55443: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11661 1726882398.55537: no more pending results, returning what we have 11661 1726882398.55539: results queue empty 11661 1726882398.55539: checking for any_errors_fatal 11661 1726882398.55543: done checking for any_errors_fatal 11661 1726882398.55544: checking for max_fail_percentage 11661 1726882398.55545: done checking for max_fail_percentage 11661 1726882398.55545: checking to see if all hosts have failed and the running result is not ok 11661 1726882398.55546: done checking to see if all hosts have failed 11661 1726882398.55546: getting the remaining hosts for this loop 11661 1726882398.55549: done getting the remaining hosts for this loop 11661 1726882398.55555: getting the next task for host managed_node2 11661 1726882398.55561: done getting next task for host managed_node2 11661 
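The censored result above ("the output has been hidden due to the fact that 'no_log: true' was specified") together with the logged invocation (`module_args: manager=["auto"], strategy="first"`) pins down the shape of the task that just ran. A sketch matching that invocation (the actual task definition lives inside the `fedora.linux_system_roles.network` role and may differ in detail):

```yaml
# Reconstruction from the logged invocation, not copied from role source:
# manager and strategy come from module_args in the log; no_log: true is
# inferred from the "censored" result printed above.
- name: Check which packages are installed
  ansible.builtin.package_facts:
    manager: auto
    strategy: first
  no_log: true
```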
1726882398.55565: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 11661 1726882398.55568: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11661 1726882398.55576: getting variables 11661 1726882398.55577: in VariableManager get_vars() 11661 1726882398.55601: Calling all_inventory to load vars for managed_node2 11661 1726882398.55603: Calling groups_inventory to load vars for managed_node2 11661 1726882398.55605: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882398.55611: Calling all_plugins_play to load vars for managed_node2 11661 1726882398.55613: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882398.55615: Calling groups_plugins_play to load vars for managed_node2 11661 1726882398.56349: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882398.57377: done with get_vars() 11661 1726882398.57395: done getting variables 11661 1726882398.57440: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:33:18 -0400 (0:00:00.763) 0:00:27.288 ****** 11661 1726882398.57476: entering _queue_task() for managed_node2/debug 11661 1726882398.57720: worker is 1 (out of 1 available) 11661 1726882398.57734: exiting _queue_task() for managed_node2/debug 11661 1726882398.57747: done queuing things up, now waiting for results queue to drain 11661 1726882398.57748: waiting for pending results... 11661 1726882398.57944: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 11661 1726882398.58043: in run() - task 0e448fcc-3ce9-896b-2321-00000000007d 11661 1726882398.58060: variable 'ansible_search_path' from source: unknown 11661 1726882398.58064: variable 'ansible_search_path' from source: unknown 11661 1726882398.58098: calling self._execute() 11661 1726882398.58179: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882398.58184: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882398.58193: variable 'omit' from source: magic vars 11661 1726882398.58472: variable 'ansible_distribution_major_version' from source: facts 11661 1726882398.58482: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882398.58489: variable 'omit' from source: magic vars 11661 1726882398.58531: variable 'omit' from source: magic vars 11661 1726882398.58601: variable 'network_provider' from source: set_fact 11661 1726882398.58662: variable 'omit' from source: magic vars 11661 1726882398.58667: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11661 
1726882398.58695: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11661 1726882398.58713: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11661 1726882398.58728: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882398.58737: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882398.58763: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11661 1726882398.58768: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882398.58771: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882398.58840: Set connection var ansible_connection to ssh 11661 1726882398.58843: Set connection var ansible_pipelining to False 11661 1726882398.58849: Set connection var ansible_module_compression to ZIP_DEFLATED 11661 1726882398.58857: Set connection var ansible_timeout to 10 11661 1726882398.58859: Set connection var ansible_shell_type to sh 11661 1726882398.58866: Set connection var ansible_shell_executable to /bin/sh 11661 1726882398.58886: variable 'ansible_shell_executable' from source: unknown 11661 1726882398.58890: variable 'ansible_connection' from source: unknown 11661 1726882398.58893: variable 'ansible_module_compression' from source: unknown 11661 1726882398.58895: variable 'ansible_shell_type' from source: unknown 11661 1726882398.58898: variable 'ansible_shell_executable' from source: unknown 11661 1726882398.58900: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882398.58902: variable 'ansible_pipelining' from source: unknown 11661 1726882398.58905: variable 'ansible_timeout' from source: unknown 11661 1726882398.58907: variable 'ansible_ssh_extra_args' 
from source: host vars for 'managed_node2' 11661 1726882398.59006: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11661 1726882398.59017: variable 'omit' from source: magic vars 11661 1726882398.59020: starting attempt loop 11661 1726882398.59023: running the handler 11661 1726882398.59061: handler run complete 11661 1726882398.59073: attempt loop complete, returning result 11661 1726882398.59076: _execute() done 11661 1726882398.59079: dumping result to json 11661 1726882398.59081: done dumping result, returning 11661 1726882398.59088: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-896b-2321-00000000007d] 11661 1726882398.59093: sending task result for task 0e448fcc-3ce9-896b-2321-00000000007d 11661 1726882398.59185: done sending task result for task 0e448fcc-3ce9-896b-2321-00000000007d 11661 1726882398.59187: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 11661 1726882398.59245: no more pending results, returning what we have 11661 1726882398.59248: results queue empty 11661 1726882398.59249: checking for any_errors_fatal 11661 1726882398.59269: done checking for any_errors_fatal 11661 1726882398.59270: checking for max_fail_percentage 11661 1726882398.59271: done checking for max_fail_percentage 11661 1726882398.59272: checking to see if all hosts have failed and the running result is not ok 11661 1726882398.59273: done checking to see if all hosts have failed 11661 1726882398.59273: getting the remaining hosts for this loop 11661 1726882398.59275: done getting the remaining hosts for this loop 11661 1726882398.59278: getting the next task for host managed_node2 11661 1726882398.59286: done getting 
next task for host managed_node2 11661 1726882398.59290: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 11661 1726882398.59293: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11661 1726882398.59304: getting variables 11661 1726882398.59306: in VariableManager get_vars() 11661 1726882398.59339: Calling all_inventory to load vars for managed_node2 11661 1726882398.59341: Calling groups_inventory to load vars for managed_node2 11661 1726882398.59343: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882398.59354: Calling all_plugins_play to load vars for managed_node2 11661 1726882398.59356: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882398.59359: Calling groups_plugins_play to load vars for managed_node2 11661 1726882398.60713: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882398.61799: done with get_vars() 11661 1726882398.61821: done getting variables 11661 1726882398.61871: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:33:18 -0400 (0:00:00.044) 0:00:27.332 ****** 11661 1726882398.61899: entering _queue_task() for managed_node2/fail 11661 1726882398.62143: worker is 1 (out of 1 available) 11661 1726882398.62156: exiting _queue_task() for managed_node2/fail 11661 1726882398.62172: done queuing things up, now waiting for results queue to drain 11661 1726882398.62174: waiting for pending results... 
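The `Print network provider` task at `roles/network/tasks/main.yml:7` resolved `network_provider` (set earlier via `set_fact`) and emitted `MSG: Using network provider: nm`. A hedged reconstruction of that debug task, with the `msg` wording inferred from the logged output rather than taken from the role source:

```yaml
# Inferred from the log line "MSG: Using network provider: nm";
# network_provider was registered earlier in the role with set_fact.
- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"
```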
11661 1726882398.62370: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 11661 1726882398.62473: in run() - task 0e448fcc-3ce9-896b-2321-00000000007e 11661 1726882398.62485: variable 'ansible_search_path' from source: unknown 11661 1726882398.62488: variable 'ansible_search_path' from source: unknown 11661 1726882398.62521: calling self._execute() 11661 1726882398.62604: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882398.62609: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882398.62619: variable 'omit' from source: magic vars 11661 1726882398.63173: variable 'ansible_distribution_major_version' from source: facts 11661 1726882398.63178: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882398.63272: variable 'network_state' from source: role '' defaults 11661 1726882398.63278: Evaluated conditional (network_state != {}): False 11661 1726882398.63281: when evaluation is False, skipping this task 11661 1726882398.63284: _execute() done 11661 1726882398.63286: dumping result to json 11661 1726882398.63289: done dumping result, returning 11661 1726882398.63292: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0e448fcc-3ce9-896b-2321-00000000007e] 11661 1726882398.63294: sending task result for task 0e448fcc-3ce9-896b-2321-00000000007e 11661 1726882398.63536: done sending task result for task 0e448fcc-3ce9-896b-2321-00000000007e 11661 1726882398.63539: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11661 1726882398.63656: no more pending results, 
returning what we have 11661 1726882398.63659: results queue empty 11661 1726882398.63660: checking for any_errors_fatal 11661 1726882398.63667: done checking for any_errors_fatal 11661 1726882398.63668: checking for max_fail_percentage 11661 1726882398.63669: done checking for max_fail_percentage 11661 1726882398.63670: checking to see if all hosts have failed and the running result is not ok 11661 1726882398.63671: done checking to see if all hosts have failed 11661 1726882398.63671: getting the remaining hosts for this loop 11661 1726882398.63673: done getting the remaining hosts for this loop 11661 1726882398.63676: getting the next task for host managed_node2 11661 1726882398.63682: done getting next task for host managed_node2 11661 1726882398.63686: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 11661 1726882398.63689: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11661 1726882398.63705: getting variables 11661 1726882398.63706: in VariableManager get_vars() 11661 1726882398.63742: Calling all_inventory to load vars for managed_node2 11661 1726882398.63744: Calling groups_inventory to load vars for managed_node2 11661 1726882398.63746: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882398.63757: Calling all_plugins_play to load vars for managed_node2 11661 1726882398.63759: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882398.63762: Calling groups_plugins_play to load vars for managed_node2 11661 1726882398.65355: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882398.67107: done with get_vars() 11661 1726882398.67137: done getting variables 11661 1726882398.67201: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:33:18 -0400 (0:00:00.053) 0:00:27.385 ****** 11661 1726882398.67237: entering _queue_task() for managed_node2/fail 11661 1726882398.67582: worker is 1 (out of 1 available) 11661 1726882398.67594: exiting _queue_task() for managed_node2/fail 11661 1726882398.67605: done queuing things up, now waiting for results queue to drain 11661 1726882398.67607: waiting for pending results... 
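The skip reported above illustrates Ansible's conditional evaluation order: the role-level gate `ansible_distribution_major_version != '6'` evaluated True, then `network_state != {}` evaluated False, so the task was skipped and the failing clause was echoed back as `false_condition`. A sketch of a `fail` task gated that way (the `msg` text is assumed; the condition is taken from the log):

```yaml
# The when: clause below matches the false_condition reported in the log;
# the msg wording is an assumption, not the role's actual text.
- name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
  ansible.builtin.fail:
    msg: Applying `network_state` is not supported with the initscripts provider  # assumed wording
  when: network_state != {}
```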
11661 1726882398.67946: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 11661 1726882398.68134: in run() - task 0e448fcc-3ce9-896b-2321-00000000007f 11661 1726882398.68145: variable 'ansible_search_path' from source: unknown 11661 1726882398.68149: variable 'ansible_search_path' from source: unknown 11661 1726882398.68195: calling self._execute() 11661 1726882398.68344: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882398.68362: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882398.68380: variable 'omit' from source: magic vars 11661 1726882398.68823: variable 'ansible_distribution_major_version' from source: facts 11661 1726882398.68859: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882398.68996: variable 'network_state' from source: role '' defaults 11661 1726882398.69023: Evaluated conditional (network_state != {}): False 11661 1726882398.69031: when evaluation is False, skipping this task 11661 1726882398.69039: _execute() done 11661 1726882398.69046: dumping result to json 11661 1726882398.69063: done dumping result, returning 11661 1726882398.69078: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0e448fcc-3ce9-896b-2321-00000000007f] 11661 1726882398.69096: sending task result for task 0e448fcc-3ce9-896b-2321-00000000007f skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11661 1726882398.69278: no more pending results, returning what we have 11661 1726882398.69282: results queue empty 11661 1726882398.69283: checking for any_errors_fatal 11661 1726882398.69291: done checking for any_errors_fatal 
11661 1726882398.69292: checking for max_fail_percentage 11661 1726882398.69294: done checking for max_fail_percentage 11661 1726882398.69294: checking to see if all hosts have failed and the running result is not ok 11661 1726882398.69295: done checking to see if all hosts have failed 11661 1726882398.69296: getting the remaining hosts for this loop 11661 1726882398.69298: done getting the remaining hosts for this loop 11661 1726882398.69302: getting the next task for host managed_node2 11661 1726882398.69311: done getting next task for host managed_node2 11661 1726882398.69317: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 11661 1726882398.69320: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11661 1726882398.69342: getting variables 11661 1726882398.69344: in VariableManager get_vars() 11661 1726882398.69393: Calling all_inventory to load vars for managed_node2 11661 1726882398.69396: Calling groups_inventory to load vars for managed_node2 11661 1726882398.69399: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882398.69412: Calling all_plugins_play to load vars for managed_node2 11661 1726882398.69415: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882398.69419: Calling groups_plugins_play to load vars for managed_node2 11661 1726882398.70885: done sending task result for task 0e448fcc-3ce9-896b-2321-00000000007f 11661 1726882398.70889: WORKER PROCESS EXITING 11661 1726882398.71234: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882398.73010: done with get_vars() 11661 1726882398.73035: done getting variables 11661 1726882398.73101: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:33:18 -0400 (0:00:00.058) 0:00:27.444 ****** 11661 1726882398.73137: entering _queue_task() for managed_node2/fail 11661 1726882398.73446: worker is 1 (out of 1 available) 11661 1726882398.73462: exiting _queue_task() for managed_node2/fail 11661 1726882398.73476: done queuing things up, now waiting for results queue to drain 11661 1726882398.73477: waiting for pending results... 
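The next task demonstrates a version gate: on this EL9 host, `ansible_distribution_major_version | int > 9` evaluates False, so the teaming abort is skipped. A hedged sketch of such a gate (condition from the log; `msg` text assumed):

```yaml
# Condition copied from the logged false_condition; the int filter coerces
# the string fact "9" before comparison. msg is an assumed placeholder.
- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  ansible.builtin.fail:
    msg: Team interfaces are not supported on EL10 or later  # assumed wording
  when: ansible_distribution_major_version | int > 9
```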
11661 1726882398.73778: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 11661 1726882398.73937: in run() - task 0e448fcc-3ce9-896b-2321-000000000080 11661 1726882398.73959: variable 'ansible_search_path' from source: unknown 11661 1726882398.73971: variable 'ansible_search_path' from source: unknown 11661 1726882398.74012: calling self._execute() 11661 1726882398.74117: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882398.74127: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882398.74143: variable 'omit' from source: magic vars 11661 1726882398.74541: variable 'ansible_distribution_major_version' from source: facts 11661 1726882398.74567: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882398.74765: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11661 1726882398.77636: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11661 1726882398.77717: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11661 1726882398.77758: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11661 1726882398.77793: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11661 1726882398.77825: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11661 1726882398.77914: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11661 1726882398.77947: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11661 1726882398.77979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882398.78021: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11661 1726882398.78040: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11661 1726882398.78141: variable 'ansible_distribution_major_version' from source: facts 11661 1726882398.78168: Evaluated conditional (ansible_distribution_major_version | int > 9): False 11661 1726882398.78175: when evaluation is False, skipping this task 11661 1726882398.78180: _execute() done 11661 1726882398.78186: dumping result to json 11661 1726882398.78191: done dumping result, returning 11661 1726882398.78202: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0e448fcc-3ce9-896b-2321-000000000080] 11661 1726882398.78210: sending task result for task 0e448fcc-3ce9-896b-2321-000000000080 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 11661 1726882398.78365: no more pending results, returning what we have 11661 1726882398.78370: results queue empty 11661 1726882398.78371: checking for any_errors_fatal 11661 1726882398.78377: done checking for any_errors_fatal 11661 
1726882398.78377: checking for max_fail_percentage 11661 1726882398.78379: done checking for max_fail_percentage 11661 1726882398.78380: checking to see if all hosts have failed and the running result is not ok 11661 1726882398.78381: done checking to see if all hosts have failed 11661 1726882398.78382: getting the remaining hosts for this loop 11661 1726882398.78383: done getting the remaining hosts for this loop 11661 1726882398.78387: getting the next task for host managed_node2 11661 1726882398.78396: done getting next task for host managed_node2 11661 1726882398.78401: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 11661 1726882398.78404: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11661 1726882398.78425: getting variables 11661 1726882398.78427: in VariableManager get_vars() 11661 1726882398.78471: Calling all_inventory to load vars for managed_node2 11661 1726882398.78474: Calling groups_inventory to load vars for managed_node2 11661 1726882398.78477: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882398.78487: Calling all_plugins_play to load vars for managed_node2 11661 1726882398.78490: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882398.78493: Calling groups_plugins_play to load vars for managed_node2 11661 1726882398.79484: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000080 11661 1726882398.79488: WORKER PROCESS EXITING 11661 1726882398.80385: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882398.82102: done with get_vars() 11661 1726882398.82136: done getting variables 11661 1726882398.82208: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:33:18 -0400 (0:00:00.091) 0:00:27.535 ****** 11661 1726882398.82244: entering _queue_task() for managed_node2/dnf 11661 1726882398.82592: worker is 1 (out of 1 available) 11661 1726882398.82605: exiting _queue_task() for managed_node2/dnf 11661 1726882398.82618: done queuing things up, now waiting for results queue to drain 11661 1726882398.82619: waiting for pending results... 
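
The skip traced above (task `0e448fcc-3ce9-896b-2321-000000000080`) records the task name and the exact `false_condition` that gated it. Reconstructed from those two logged facts, the guard in the role looks roughly like the sketch below; the module used and the message text are assumptions, only the name and the `when` expression come from this log:

```yaml
# Sketch only: task name and 'when' expression are taken from the trace;
# the fail module and message are assumed, not confirmed by the log.
- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  fail:
    msg: Teaming is not supported on EL10 or later  # assumed wording
  when: ansible_distribution_major_version | int > 9
```

On this run the managed host reports a major version of 9 or lower, so the conditional evaluates False and the task is skipped rather than failing the play.
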
11661 1726882398.82929: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 11661 1726882398.83097: in run() - task 0e448fcc-3ce9-896b-2321-000000000081 11661 1726882398.83115: variable 'ansible_search_path' from source: unknown 11661 1726882398.83123: variable 'ansible_search_path' from source: unknown 11661 1726882398.83167: calling self._execute() 11661 1726882398.83278: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882398.83292: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882398.83306: variable 'omit' from source: magic vars 11661 1726882398.83694: variable 'ansible_distribution_major_version' from source: facts 11661 1726882398.83711: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882398.83925: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11661 1726882398.86410: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11661 1726882398.86495: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11661 1726882398.86543: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11661 1726882398.86589: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11661 1726882398.86624: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11661 1726882398.86732: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11661 1726882398.86776: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11661 1726882398.86806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882398.86854: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11661 1726882398.86880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11661 1726882398.87015: variable 'ansible_distribution' from source: facts 11661 1726882398.87025: variable 'ansible_distribution_major_version' from source: facts 11661 1726882398.87043: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 11661 1726882398.87181: variable '__network_wireless_connections_defined' from source: role '' defaults 11661 1726882398.87325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11661 1726882398.87357: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11661 1726882398.87391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882398.87441: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11661 1726882398.87468: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11661 1726882398.87516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11661 1726882398.87545: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11661 1726882398.87580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882398.87627: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11661 1726882398.87648: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11661 1726882398.87696: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11661 1726882398.87727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11661 
1726882398.87762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882398.87808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11661 1726882398.87826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11661 1726882398.88000: variable 'network_connections' from source: task vars 11661 1726882398.88017: variable 'port2_profile' from source: play vars 11661 1726882398.88090: variable 'port2_profile' from source: play vars 11661 1726882398.88104: variable 'port1_profile' from source: play vars 11661 1726882398.88174: variable 'port1_profile' from source: play vars 11661 1726882398.88187: variable 'controller_profile' from source: play vars 11661 1726882398.88250: variable 'controller_profile' from source: play vars 11661 1726882398.88332: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11661 1726882398.88494: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11661 1726882398.88533: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11661 1726882398.88573: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11661 1726882398.88607: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11661 1726882398.88656: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 
(found_in_cache=True, class_only=False) 11661 1726882398.88693: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11661 1726882398.88719: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882398.88748: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11661 1726882398.88806: variable '__network_team_connections_defined' from source: role '' defaults 11661 1726882398.89073: variable 'network_connections' from source: task vars 11661 1726882398.89084: variable 'port2_profile' from source: play vars 11661 1726882398.89147: variable 'port2_profile' from source: play vars 11661 1726882398.89168: variable 'port1_profile' from source: play vars 11661 1726882398.89230: variable 'port1_profile' from source: play vars 11661 1726882398.89245: variable 'controller_profile' from source: play vars 11661 1726882398.89318: variable 'controller_profile' from source: play vars 11661 1726882398.89346: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11661 1726882398.89356: when evaluation is False, skipping this task 11661 1726882398.89362: _execute() done 11661 1726882398.89371: dumping result to json 11661 1726882398.89382: done dumping result, returning 11661 1726882398.89393: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-896b-2321-000000000081] 11661 1726882398.89401: sending task result for task 
0e448fcc-3ce9-896b-2321-000000000081 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11661 1726882398.89557: no more pending results, returning what we have 11661 1726882398.89562: results queue empty 11661 1726882398.89563: checking for any_errors_fatal 11661 1726882398.89572: done checking for any_errors_fatal 11661 1726882398.89572: checking for max_fail_percentage 11661 1726882398.89575: done checking for max_fail_percentage 11661 1726882398.89575: checking to see if all hosts have failed and the running result is not ok 11661 1726882398.89576: done checking to see if all hosts have failed 11661 1726882398.89577: getting the remaining hosts for this loop 11661 1726882398.89579: done getting the remaining hosts for this loop 11661 1726882398.89583: getting the next task for host managed_node2 11661 1726882398.89591: done getting next task for host managed_node2 11661 1726882398.89596: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 11661 1726882398.89599: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11661 1726882398.89618: getting variables 11661 1726882398.89620: in VariableManager get_vars() 11661 1726882398.89665: Calling all_inventory to load vars for managed_node2 11661 1726882398.89668: Calling groups_inventory to load vars for managed_node2 11661 1726882398.89671: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882398.89681: Calling all_plugins_play to load vars for managed_node2 11661 1726882398.89684: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882398.89687: Calling groups_plugins_play to load vars for managed_node2 11661 1726882398.90683: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000081 11661 1726882398.90686: WORKER PROCESS EXITING 11661 1726882398.91442: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882398.93334: done with get_vars() 11661 1726882398.93358: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 11661 1726882398.93433: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:33:18 -0400 (0:00:00.112) 0:00:27.648 ****** 11661 1726882398.93472: entering _queue_task() for managed_node2/yum 11661 1726882398.93787: worker is 1 (out of 1 available) 
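
Two details are visible in the trace just above: ansible-core redirects the queued `yum` action to `ansible.builtin.dnf` ("redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf"), and the task is gated to EL7 and earlier, which is why it skips on this host. A hedged sketch of that gate; only the task name and the logged `false_condition` are from the trace, the module options are assumptions:

```yaml
# Sketch: the 'when' gate is the logged false_condition
# (ansible_distribution_major_version | int < 8); the 'list' option is
# an assumed illustration of an update check, not taken from the role.
- name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
  yum:
    list: updates  # assumed option
  when: ansible_distribution_major_version | int < 8
```
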
11661 1726882398.93800: exiting _queue_task() for managed_node2/yum 11661 1726882398.93813: done queuing things up, now waiting for results queue to drain 11661 1726882398.93814: waiting for pending results... 11661 1726882398.94110: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 11661 1726882398.94268: in run() - task 0e448fcc-3ce9-896b-2321-000000000082 11661 1726882398.94287: variable 'ansible_search_path' from source: unknown 11661 1726882398.94295: variable 'ansible_search_path' from source: unknown 11661 1726882398.94331: calling self._execute() 11661 1726882398.94438: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882398.94450: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882398.94470: variable 'omit' from source: magic vars 11661 1726882398.94846: variable 'ansible_distribution_major_version' from source: facts 11661 1726882398.94867: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882398.95050: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11661 1726882398.97404: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11661 1726882398.97486: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11661 1726882398.97537: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11661 1726882398.97583: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11661 1726882398.97616: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11661 1726882398.97711: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11661 1726882398.97767: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11661 1726882398.97800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882398.97850: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11661 1726882398.97876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11661 1726882398.97984: variable 'ansible_distribution_major_version' from source: facts 11661 1726882398.98006: Evaluated conditional (ansible_distribution_major_version | int < 8): False 11661 1726882398.98013: when evaluation is False, skipping this task 11661 1726882398.98020: _execute() done 11661 1726882398.98027: dumping result to json 11661 1726882398.98035: done dumping result, returning 11661 1726882398.98047: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0e448fcc-3ce9-896b-2321-000000000082] 11661 1726882398.98060: sending task result for task 0e448fcc-3ce9-896b-2321-000000000082 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was 
False" } 11661 1726882398.98230: no more pending results, returning what we have 11661 1726882398.98234: results queue empty 11661 1726882398.98235: checking for any_errors_fatal 11661 1726882398.98241: done checking for any_errors_fatal 11661 1726882398.98242: checking for max_fail_percentage 11661 1726882398.98245: done checking for max_fail_percentage 11661 1726882398.98245: checking to see if all hosts have failed and the running result is not ok 11661 1726882398.98246: done checking to see if all hosts have failed 11661 1726882398.98247: getting the remaining hosts for this loop 11661 1726882398.98249: done getting the remaining hosts for this loop 11661 1726882398.98255: getting the next task for host managed_node2 11661 1726882398.98268: done getting next task for host managed_node2 11661 1726882398.98272: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 11661 1726882398.98276: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11661 1726882398.98296: getting variables 11661 1726882398.98298: in VariableManager get_vars() 11661 1726882398.98341: Calling all_inventory to load vars for managed_node2 11661 1726882398.98344: Calling groups_inventory to load vars for managed_node2 11661 1726882398.98346: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882398.98360: Calling all_plugins_play to load vars for managed_node2 11661 1726882398.98365: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882398.98369: Calling groups_plugins_play to load vars for managed_node2 11661 1726882398.99382: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000082 11661 1726882398.99385: WORKER PROCESS EXITING 11661 1726882399.00130: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882399.01913: done with get_vars() 11661 1726882399.01946: done getting variables 11661 1726882399.02012: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:33:19 -0400 (0:00:00.085) 0:00:27.733 ****** 11661 1726882399.02053: entering _queue_task() for managed_node2/fail 11661 1726882399.02396: worker is 1 (out of 1 available) 11661 1726882399.02409: exiting _queue_task() for managed_node2/fail 11661 1726882399.02421: done queuing things up, now waiting for results queue to drain 11661 1726882399.02423: waiting for pending results... 
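
For the consent task queued next, the trace confirms the action plugin directly ("Loading ActionModule 'fail'") and the subsequent conditional evaluation shows the gate expression. A minimal sketch assembled from those logged facts, with the message text assumed:

```yaml
# Sketch: the 'fail' action and the 'when' expression are confirmed by
# the trace; the message wording is an assumption.
- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  fail:
    msg: NetworkManager must be restarted to apply wireless or team connections  # assumed wording
  when: __network_wireless_connections_defined or __network_team_connections_defined
```

Since neither wireless nor team connections are defined for the queued profiles (`controller_profile`, `port1_profile`, `port2_profile`), the condition evaluates False and the task skips, as the result block below records.
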
11661 1726882399.02729: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 11661 1726882399.02885: in run() - task 0e448fcc-3ce9-896b-2321-000000000083 11661 1726882399.02908: variable 'ansible_search_path' from source: unknown 11661 1726882399.02917: variable 'ansible_search_path' from source: unknown 11661 1726882399.02966: calling self._execute() 11661 1726882399.03086: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882399.03099: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882399.03115: variable 'omit' from source: magic vars 11661 1726882399.03533: variable 'ansible_distribution_major_version' from source: facts 11661 1726882399.03556: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882399.03688: variable '__network_wireless_connections_defined' from source: role '' defaults 11661 1726882399.03912: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11661 1726882399.06618: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11661 1726882399.06679: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11661 1726882399.06702: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11661 1726882399.06729: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11661 1726882399.06748: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11661 1726882399.06814: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 11661 1726882399.06835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11661 1726882399.06852: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882399.06884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11661 1726882399.06895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11661 1726882399.06927: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11661 1726882399.06944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11661 1726882399.06963: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882399.06989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11661 1726882399.07002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11661 1726882399.07029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11661 1726882399.07048: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11661 1726882399.07069: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882399.07094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11661 1726882399.07109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11661 1726882399.07227: variable 'network_connections' from source: task vars 11661 1726882399.07236: variable 'port2_profile' from source: play vars 11661 1726882399.07289: variable 'port2_profile' from source: play vars 11661 1726882399.07297: variable 'port1_profile' from source: play vars 11661 1726882399.07342: variable 'port1_profile' from source: play vars 11661 1726882399.07348: variable 'controller_profile' from source: play vars 11661 1726882399.07394: variable 'controller_profile' from source: play vars 11661 1726882399.07445: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11661 1726882399.07572: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11661 1726882399.07600: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11661 1726882399.07622: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11661 1726882399.07643: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11661 1726882399.07678: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11661 1726882399.07696: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11661 1726882399.07711: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882399.07728: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11661 1726882399.07772: variable '__network_team_connections_defined' from source: role '' defaults 11661 1726882399.07923: variable 'network_connections' from source: task vars 11661 1726882399.07927: variable 'port2_profile' from source: play vars 11661 1726882399.07970: variable 'port2_profile' from source: play vars 11661 1726882399.07978: variable 'port1_profile' from source: play vars 11661 1726882399.08020: variable 'port1_profile' from source: play vars 11661 1726882399.08025: variable 'controller_profile' from source: play vars 11661 1726882399.08072: variable 'controller_profile' from source: play vars 11661 1726882399.08095: Evaluated conditional 
(__network_wireless_connections_defined or __network_team_connections_defined): False 11661 1726882399.08107: when evaluation is False, skipping this task 11661 1726882399.08109: _execute() done 11661 1726882399.08112: dumping result to json 11661 1726882399.08114: done dumping result, returning 11661 1726882399.08116: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-896b-2321-000000000083] 11661 1726882399.08119: sending task result for task 0e448fcc-3ce9-896b-2321-000000000083 11661 1726882399.08501: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000083 11661 1726882399.08505: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11661 1726882399.08570: no more pending results, returning what we have 11661 1726882399.08573: results queue empty 11661 1726882399.08574: checking for any_errors_fatal 11661 1726882399.08580: done checking for any_errors_fatal 11661 1726882399.08581: checking for max_fail_percentage 11661 1726882399.08583: done checking for max_fail_percentage 11661 1726882399.08584: checking to see if all hosts have failed and the running result is not ok 11661 1726882399.08585: done checking to see if all hosts have failed 11661 1726882399.08586: getting the remaining hosts for this loop 11661 1726882399.08587: done getting the remaining hosts for this loop 11661 1726882399.08591: getting the next task for host managed_node2 11661 1726882399.08597: done getting next task for host managed_node2 11661 1726882399.08602: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 11661 1726882399.08608: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11661 1726882399.08625: getting variables 11661 1726882399.08626: in VariableManager get_vars() 11661 1726882399.08669: Calling all_inventory to load vars for managed_node2 11661 1726882399.08673: Calling groups_inventory to load vars for managed_node2 11661 1726882399.08676: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882399.08686: Calling all_plugins_play to load vars for managed_node2 11661 1726882399.08689: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882399.08692: Calling groups_plugins_play to load vars for managed_node2 11661 1726882399.09932: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882399.10875: done with get_vars() 11661 1726882399.10892: done getting variables 11661 1726882399.10939: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:33:19 -0400 (0:00:00.089) 0:00:27.823 ****** 11661 1726882399.10970: entering _queue_task() for managed_node2/package 11661 1726882399.11208: worker is 1 (out of 1 available) 11661 1726882399.11223: exiting _queue_task() for managed_node2/package 11661 1726882399.11235: done queuing things up, now waiting for results queue to drain 11661 1726882399.11236: waiting for pending results... 11661 1726882399.11508: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 11661 1726882399.11666: in run() - task 0e448fcc-3ce9-896b-2321-000000000084 11661 1726882399.11687: variable 'ansible_search_path' from source: unknown 11661 1726882399.11698: variable 'ansible_search_path' from source: unknown 11661 1726882399.11744: calling self._execute() 11661 1726882399.11866: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882399.11879: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882399.11895: variable 'omit' from source: magic vars 11661 1726882399.12291: variable 'ansible_distribution_major_version' from source: facts 11661 1726882399.12309: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882399.12526: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11661 1726882399.12821: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11661 1726882399.12875: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11661 1726882399.12921: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11661 1726882399.13002: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11661 
1726882399.13127: variable 'network_packages' from source: role '' defaults 11661 1726882399.13249: variable '__network_provider_setup' from source: role '' defaults 11661 1726882399.13280: variable '__network_service_name_default_nm' from source: role '' defaults 11661 1726882399.13343: variable '__network_service_name_default_nm' from source: role '' defaults 11661 1726882399.13356: variable '__network_packages_default_nm' from source: role '' defaults 11661 1726882399.13407: variable '__network_packages_default_nm' from source: role '' defaults 11661 1726882399.13533: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11661 1726882399.14981: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11661 1726882399.15026: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11661 1726882399.15057: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11661 1726882399.15082: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11661 1726882399.15107: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11661 1726882399.15194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11661 1726882399.15217: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11661 1726882399.15248: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 11661 1726882399.15289: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11661 1726882399.15299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11661 1726882399.15337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11661 1726882399.15372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11661 1726882399.15391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882399.15416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11661 1726882399.15452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11661 1726882399.15654: variable '__network_packages_default_gobject_packages' from source: role '' defaults 11661 1726882399.15770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11661 1726882399.15800: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11661 1726882399.15830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882399.15878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11661 1726882399.15897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11661 1726882399.15990: variable 'ansible_python' from source: facts 11661 1726882399.16020: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 11661 1726882399.16107: variable '__network_wpa_supplicant_required' from source: role '' defaults 11661 1726882399.16191: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11661 1726882399.16336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11661 1726882399.16368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11661 1726882399.16400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882399.16440: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11661 1726882399.16455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11661 1726882399.16487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11661 1726882399.16509: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11661 1726882399.16540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882399.16567: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11661 1726882399.16579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11661 1726882399.16698: variable 'network_connections' from source: task vars 11661 1726882399.16702: variable 'port2_profile' from source: play vars 11661 1726882399.16841: variable 'port2_profile' from source: play vars 11661 1726882399.16844: variable 'port1_profile' from source: play vars 11661 1726882399.16930: variable 'port1_profile' from source: play vars 11661 1726882399.16954: variable 'controller_profile' from source: play vars 11661 1726882399.17055: 
variable 'controller_profile' from source: play vars 11661 1726882399.17135: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11661 1726882399.17179: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11661 1726882399.17214: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882399.17254: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11661 1726882399.17315: variable '__network_wireless_connections_defined' from source: role '' defaults 11661 1726882399.17617: variable 'network_connections' from source: task vars 11661 1726882399.17627: variable 'port2_profile' from source: play vars 11661 1726882399.17736: variable 'port2_profile' from source: play vars 11661 1726882399.17754: variable 'port1_profile' from source: play vars 11661 1726882399.17866: variable 'port1_profile' from source: play vars 11661 1726882399.17881: variable 'controller_profile' from source: play vars 11661 1726882399.17991: variable 'controller_profile' from source: play vars 11661 1726882399.18054: variable '__network_packages_default_wireless' from source: role '' defaults 11661 1726882399.18116: variable '__network_wireless_connections_defined' from source: role '' defaults 11661 1726882399.18343: variable 'network_connections' from source: task vars 11661 1726882399.18346: variable 'port2_profile' from source: play vars 11661 1726882399.18399: variable 'port2_profile' from source: play vars 11661 1726882399.18405: variable 
'port1_profile' from source: play vars 11661 1726882399.18449: variable 'port1_profile' from source: play vars 11661 1726882399.18457: variable 'controller_profile' from source: play vars 11661 1726882399.18505: variable 'controller_profile' from source: play vars 11661 1726882399.18523: variable '__network_packages_default_team' from source: role '' defaults 11661 1726882399.18579: variable '__network_team_connections_defined' from source: role '' defaults 11661 1726882399.18778: variable 'network_connections' from source: task vars 11661 1726882399.18781: variable 'port2_profile' from source: play vars 11661 1726882399.18828: variable 'port2_profile' from source: play vars 11661 1726882399.18834: variable 'port1_profile' from source: play vars 11661 1726882399.18882: variable 'port1_profile' from source: play vars 11661 1726882399.18888: variable 'controller_profile' from source: play vars 11661 1726882399.18935: variable 'controller_profile' from source: play vars 11661 1726882399.18977: variable '__network_service_name_default_initscripts' from source: role '' defaults 11661 1726882399.19019: variable '__network_service_name_default_initscripts' from source: role '' defaults 11661 1726882399.19023: variable '__network_packages_default_initscripts' from source: role '' defaults 11661 1726882399.19071: variable '__network_packages_default_initscripts' from source: role '' defaults 11661 1726882399.19206: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 11661 1726882399.19501: variable 'network_connections' from source: task vars 11661 1726882399.19504: variable 'port2_profile' from source: play vars 11661 1726882399.19546: variable 'port2_profile' from source: play vars 11661 1726882399.19552: variable 'port1_profile' from source: play vars 11661 1726882399.19599: variable 'port1_profile' from source: play vars 11661 1726882399.19605: variable 'controller_profile' from source: play vars 11661 1726882399.19646: variable 
'controller_profile' from source: play vars 11661 1726882399.19652: variable 'ansible_distribution' from source: facts 11661 1726882399.19658: variable '__network_rh_distros' from source: role '' defaults 11661 1726882399.19665: variable 'ansible_distribution_major_version' from source: facts 11661 1726882399.19681: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 11661 1726882399.19786: variable 'ansible_distribution' from source: facts 11661 1726882399.19789: variable '__network_rh_distros' from source: role '' defaults 11661 1726882399.19791: variable 'ansible_distribution_major_version' from source: facts 11661 1726882399.19805: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 11661 1726882399.19959: variable 'ansible_distribution' from source: facts 11661 1726882399.19970: variable '__network_rh_distros' from source: role '' defaults 11661 1726882399.19979: variable 'ansible_distribution_major_version' from source: facts 11661 1726882399.20050: variable 'network_provider' from source: set_fact 11661 1726882399.20073: variable 'ansible_facts' from source: unknown 11661 1726882399.20996: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 11661 1726882399.21000: when evaluation is False, skipping this task 11661 1726882399.21002: _execute() done 11661 1726882399.21004: dumping result to json 11661 1726882399.21007: done dumping result, returning 11661 1726882399.21017: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0e448fcc-3ce9-896b-2321-000000000084] 11661 1726882399.21020: sending task result for task 0e448fcc-3ce9-896b-2321-000000000084 11661 1726882399.21133: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000084 11661 1726882399.21135: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is 
subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 11661 1726882399.21184: no more pending results, returning what we have 11661 1726882399.21189: results queue empty 11661 1726882399.21189: checking for any_errors_fatal 11661 1726882399.21197: done checking for any_errors_fatal 11661 1726882399.21198: checking for max_fail_percentage 11661 1726882399.21200: done checking for max_fail_percentage 11661 1726882399.21201: checking to see if all hosts have failed and the running result is not ok 11661 1726882399.21201: done checking to see if all hosts have failed 11661 1726882399.21202: getting the remaining hosts for this loop 11661 1726882399.21204: done getting the remaining hosts for this loop 11661 1726882399.21212: getting the next task for host managed_node2 11661 1726882399.21219: done getting next task for host managed_node2 11661 1726882399.21224: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 11661 1726882399.21227: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
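
The "Install packages" skip above pairs the `package` action plugin (loaded in the log) with a `subset` test against the gathered package facts. A hedged sketch of such a task, reconstructed from the logged task name, action module, and `false_condition` (the real task lives at roles/network/tasks/main.yml:73 and may differ in detail):

```yaml
# Hedged sketch: install the role's package list only when at least one
# package is missing from the gathered package facts. The `when` expression
# is taken verbatim from the logged false_condition.
- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"
    state: present
  when: not network_packages is subset(ansible_facts.packages.keys())
```

Here every package in `network_packages` is already present in `ansible_facts.packages`, so the `subset` test holds, the negated conditional is False, and the install is skipped.
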
False 11661 1726882399.21259: getting variables 11661 1726882399.21261: in VariableManager get_vars() 11661 1726882399.21300: Calling all_inventory to load vars for managed_node2 11661 1726882399.21302: Calling groups_inventory to load vars for managed_node2 11661 1726882399.21304: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882399.21313: Calling all_plugins_play to load vars for managed_node2 11661 1726882399.21315: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882399.21317: Calling groups_plugins_play to load vars for managed_node2 11661 1726882399.22160: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882399.23207: done with get_vars() 11661 1726882399.23223: done getting variables 11661 1726882399.23272: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:33:19 -0400 (0:00:00.123) 0:00:27.946 ****** 11661 1726882399.23298: entering _queue_task() for managed_node2/package 11661 1726882399.23523: worker is 1 (out of 1 available) 11661 1726882399.23537: exiting _queue_task() for managed_node2/package 11661 1726882399.23550: done queuing things up, now waiting for results queue to drain 11661 1726882399.23554: waiting for pending results... 
11661 1726882399.23742: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 11661 1726882399.23831: in run() - task 0e448fcc-3ce9-896b-2321-000000000085 11661 1726882399.23843: variable 'ansible_search_path' from source: unknown 11661 1726882399.23846: variable 'ansible_search_path' from source: unknown 11661 1726882399.23879: calling self._execute() 11661 1726882399.23958: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882399.23962: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882399.23972: variable 'omit' from source: magic vars 11661 1726882399.24245: variable 'ansible_distribution_major_version' from source: facts 11661 1726882399.24254: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882399.24344: variable 'network_state' from source: role '' defaults 11661 1726882399.24356: Evaluated conditional (network_state != {}): False 11661 1726882399.24359: when evaluation is False, skipping this task 11661 1726882399.24362: _execute() done 11661 1726882399.24369: dumping result to json 11661 1726882399.24371: done dumping result, returning 11661 1726882399.24374: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0e448fcc-3ce9-896b-2321-000000000085] 11661 1726882399.24377: sending task result for task 0e448fcc-3ce9-896b-2321-000000000085 11661 1726882399.24472: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000085 11661 1726882399.24474: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11661 1726882399.24524: no more pending results, returning what we have 11661 1726882399.24528: results queue empty 11661 1726882399.24529: checking 
for any_errors_fatal 11661 1726882399.24535: done checking for any_errors_fatal 11661 1726882399.24536: checking for max_fail_percentage 11661 1726882399.24538: done checking for max_fail_percentage 11661 1726882399.24538: checking to see if all hosts have failed and the running result is not ok 11661 1726882399.24539: done checking to see if all hosts have failed 11661 1726882399.24540: getting the remaining hosts for this loop 11661 1726882399.24541: done getting the remaining hosts for this loop 11661 1726882399.24544: getting the next task for host managed_node2 11661 1726882399.24554: done getting next task for host managed_node2 11661 1726882399.24558: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 11661 1726882399.24561: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11661 1726882399.24579: getting variables 11661 1726882399.24581: in VariableManager get_vars() 11661 1726882399.24620: Calling all_inventory to load vars for managed_node2 11661 1726882399.24623: Calling groups_inventory to load vars for managed_node2 11661 1726882399.24625: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882399.24633: Calling all_plugins_play to load vars for managed_node2 11661 1726882399.24636: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882399.24638: Calling groups_plugins_play to load vars for managed_node2 11661 1726882399.25420: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882399.26367: done with get_vars() 11661 1726882399.26384: done getting variables 11661 1726882399.26428: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:33:19 -0400 (0:00:00.031) 0:00:27.977 ****** 11661 1726882399.26456: entering _queue_task() for managed_node2/package 11661 1726882399.26684: worker is 1 (out of 1 available) 11661 1726882399.26698: exiting _queue_task() for managed_node2/package 11661 1726882399.26710: done queuing things up, now waiting for results queue to drain 11661 1726882399.26712: waiting for pending results... 
11661 1726882399.26895: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 11661 1726882399.26991: in run() - task 0e448fcc-3ce9-896b-2321-000000000086 11661 1726882399.27002: variable 'ansible_search_path' from source: unknown 11661 1726882399.27005: variable 'ansible_search_path' from source: unknown 11661 1726882399.27035: calling self._execute() 11661 1726882399.27118: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882399.27122: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882399.27130: variable 'omit' from source: magic vars 11661 1726882399.27400: variable 'ansible_distribution_major_version' from source: facts 11661 1726882399.27410: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882399.27493: variable 'network_state' from source: role '' defaults 11661 1726882399.27501: Evaluated conditional (network_state != {}): False 11661 1726882399.27504: when evaluation is False, skipping this task 11661 1726882399.27507: _execute() done 11661 1726882399.27511: dumping result to json 11661 1726882399.27514: done dumping result, returning 11661 1726882399.27518: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-896b-2321-000000000086] 11661 1726882399.27530: sending task result for task 0e448fcc-3ce9-896b-2321-000000000086 11661 1726882399.27617: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000086 11661 1726882399.27620: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11661 1726882399.27670: no more pending results, returning what we have 11661 1726882399.27674: results queue empty 11661 1726882399.27675: checking for 
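
Both nmstate-related tasks above (main.yml:85 and main.yml:96) are skipped on the same `network_state != {}` conditional. A hedged sketch of how such tasks could be written — the task names and conditional come from the log, while the exact package lists are assumptions inferred from the task titles:

```yaml
# Hedged sketch of the two network_state-gated install tasks; package
# names are assumptions based on the logged task titles.
- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:
    name:
      - NetworkManager
      - nmstate
    state: present
  when: network_state != {}

- name: Install python3-libnmstate when using network_state variable
  ansible.builtin.package:
    name: python3-libnmstate
    state: present
  when: network_state != {}
```

In this run `network_state` is the role default (an empty dict), so both conditionals evaluate False and both installs are skipped.
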
any_errors_fatal 11661 1726882399.27681: done checking for any_errors_fatal 11661 1726882399.27681: checking for max_fail_percentage 11661 1726882399.27683: done checking for max_fail_percentage 11661 1726882399.27684: checking to see if all hosts have failed and the running result is not ok 11661 1726882399.27684: done checking to see if all hosts have failed 11661 1726882399.27685: getting the remaining hosts for this loop 11661 1726882399.27687: done getting the remaining hosts for this loop 11661 1726882399.27690: getting the next task for host managed_node2 11661 1726882399.27697: done getting next task for host managed_node2 11661 1726882399.27701: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 11661 1726882399.27705: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11661 1726882399.27724: getting variables 11661 1726882399.27725: in VariableManager get_vars() 11661 1726882399.27758: Calling all_inventory to load vars for managed_node2 11661 1726882399.27760: Calling groups_inventory to load vars for managed_node2 11661 1726882399.27762: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882399.27772: Calling all_plugins_play to load vars for managed_node2 11661 1726882399.27775: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882399.27778: Calling groups_plugins_play to load vars for managed_node2 11661 1726882399.28668: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882399.29590: done with get_vars() 11661 1726882399.29606: done getting variables 11661 1726882399.29647: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:33:19 -0400 (0:00:00.032) 0:00:28.010 ****** 11661 1726882399.29676: entering _queue_task() for managed_node2/service 11661 1726882399.29894: worker is 1 (out of 1 available) 11661 1726882399.29909: exiting _queue_task() for managed_node2/service 11661 1726882399.29921: done queuing things up, now waiting for results queue to drain 11661 1726882399.29922: waiting for pending results... 
11661 1726882399.30111: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 11661 1726882399.30210: in run() - task 0e448fcc-3ce9-896b-2321-000000000087 11661 1726882399.30221: variable 'ansible_search_path' from source: unknown 11661 1726882399.30224: variable 'ansible_search_path' from source: unknown 11661 1726882399.30257: calling self._execute() 11661 1726882399.30337: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882399.30341: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882399.30349: variable 'omit' from source: magic vars 11661 1726882399.30628: variable 'ansible_distribution_major_version' from source: facts 11661 1726882399.30638: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882399.30722: variable '__network_wireless_connections_defined' from source: role '' defaults 11661 1726882399.30858: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11661 1726882399.32424: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11661 1726882399.32470: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11661 1726882399.32497: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11661 1726882399.32525: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11661 1726882399.32548: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11661 1726882399.32606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 11661 1726882399.32640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11661 1726882399.32659: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882399.32688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11661 1726882399.32700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11661 1726882399.32730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11661 1726882399.32751: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11661 1726882399.32773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882399.32798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11661 1726882399.32809: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11661 1726882399.32837: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11661 1726882399.32860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11661 1726882399.32885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882399.32910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11661 1726882399.32920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11661 1726882399.33033: variable 'network_connections' from source: task vars 11661 1726882399.33043: variable 'port2_profile' from source: play vars 11661 1726882399.33094: variable 'port2_profile' from source: play vars 11661 1726882399.33102: variable 'port1_profile' from source: play vars 11661 1726882399.33146: variable 'port1_profile' from source: play vars 11661 1726882399.33154: variable 'controller_profile' from source: play vars 11661 1726882399.33198: variable 'controller_profile' from source: play vars 11661 1726882399.33247: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11661 1726882399.33357: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11661 1726882399.33385: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11661 1726882399.33411: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11661 1726882399.33432: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11661 1726882399.33463: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11661 1726882399.33480: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11661 1726882399.33502: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882399.33518: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11661 1726882399.33561: variable '__network_team_connections_defined' from source: role '' defaults 11661 1726882399.33728: variable 'network_connections' from source: task vars 11661 1726882399.33731: variable 'port2_profile' from source: play vars 11661 1726882399.33777: variable 'port2_profile' from source: play vars 11661 1726882399.33784: variable 'port1_profile' from source: play vars 11661 1726882399.33825: variable 'port1_profile' from source: play vars 11661 1726882399.33834: variable 'controller_profile' from source: play vars 11661 1726882399.33881: variable 'controller_profile' from source: play vars 11661 1726882399.33899: Evaluated conditional 
(__network_wireless_connections_defined or __network_team_connections_defined): False 11661 1726882399.33910: when evaluation is False, skipping this task 11661 1726882399.33912: _execute() done 11661 1726882399.33915: dumping result to json 11661 1726882399.33917: done dumping result, returning 11661 1726882399.33919: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-896b-2321-000000000087] 11661 1726882399.33927: sending task result for task 0e448fcc-3ce9-896b-2321-000000000087 11661 1726882399.34032: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000087 11661 1726882399.34034: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11661 1726882399.34082: no more pending results, returning what we have 11661 1726882399.34086: results queue empty 11661 1726882399.34086: checking for any_errors_fatal 11661 1726882399.34094: done checking for any_errors_fatal 11661 1726882399.34095: checking for max_fail_percentage 11661 1726882399.34096: done checking for max_fail_percentage 11661 1726882399.34097: checking to see if all hosts have failed and the running result is not ok 11661 1726882399.34098: done checking to see if all hosts have failed 11661 1726882399.34099: getting the remaining hosts for this loop 11661 1726882399.34100: done getting the remaining hosts for this loop 11661 1726882399.34103: getting the next task for host managed_node2 11661 1726882399.34110: done getting next task for host managed_node2 11661 1726882399.34114: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 11661 1726882399.34117: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11661 1726882399.34140: getting variables 11661 1726882399.34142: in VariableManager get_vars() 11661 1726882399.34186: Calling all_inventory to load vars for managed_node2 11661 1726882399.34188: Calling groups_inventory to load vars for managed_node2 11661 1726882399.34191: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882399.34199: Calling all_plugins_play to load vars for managed_node2 11661 1726882399.34202: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882399.34204: Calling groups_plugins_play to load vars for managed_node2 11661 1726882399.35212: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882399.36631: done with get_vars() 11661 1726882399.36657: done getting variables 11661 1726882399.36705: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:33:19 -0400 (0:00:00.070) 0:00:28.080 ****** 11661 1726882399.36732: entering _queue_task() for managed_node2/service 11661 1726882399.36970: worker is 1 (out of 1 available) 11661 1726882399.36983: exiting _queue_task() for managed_node2/service 11661 1726882399.36996: done queuing things up, now waiting for results queue to drain 11661 1726882399.36998: waiting for pending results... 11661 1726882399.37196: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 11661 1726882399.37290: in run() - task 0e448fcc-3ce9-896b-2321-000000000088 11661 1726882399.37302: variable 'ansible_search_path' from source: unknown 11661 1726882399.37307: variable 'ansible_search_path' from source: unknown 11661 1726882399.37337: calling self._execute() 11661 1726882399.37419: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882399.37424: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882399.37435: variable 'omit' from source: magic vars 11661 1726882399.37711: variable 'ansible_distribution_major_version' from source: facts 11661 1726882399.37721: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882399.37833: variable 'network_provider' from source: set_fact 11661 1726882399.37837: variable 'network_state' from source: role '' defaults 11661 1726882399.37845: Evaluated conditional (network_provider == "nm" or network_state != {}): True 11661 1726882399.37854: variable 'omit' from source: magic vars 11661 1726882399.37903: variable 'omit' from source: magic vars 11661 1726882399.37923: variable 'network_service_name' from source: role '' defaults 11661 1726882399.37970: variable 'network_service_name' from source: role '' defaults 11661 1726882399.38067: variable '__network_provider_setup' from 
source: role '' defaults 11661 1726882399.38078: variable '__network_service_name_default_nm' from source: role '' defaults 11661 1726882399.38140: variable '__network_service_name_default_nm' from source: role '' defaults 11661 1726882399.38151: variable '__network_packages_default_nm' from source: role '' defaults 11661 1726882399.38212: variable '__network_packages_default_nm' from source: role '' defaults 11661 1726882399.38425: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11661 1726882399.41004: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11661 1726882399.41081: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11661 1726882399.41124: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11661 1726882399.41162: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11661 1726882399.41199: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11661 1726882399.41282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11661 1726882399.41315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11661 1726882399.41343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882399.41391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11661 1726882399.41410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11661 1726882399.41456: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11661 1726882399.41486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11661 1726882399.41514: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882399.41556: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11661 1726882399.41576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11661 1726882399.41823: variable '__network_packages_default_gobject_packages' from source: role '' defaults 11661 1726882399.41943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11661 1726882399.41974: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 11661 1726882399.42001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882399.42042: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11661 1726882399.42060: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11661 1726882399.42159: variable 'ansible_python' from source: facts 11661 1726882399.42189: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 11661 1726882399.42276: variable '__network_wpa_supplicant_required' from source: role '' defaults 11661 1726882399.42357: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11661 1726882399.42484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11661 1726882399.42513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11661 1726882399.42541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882399.42586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11661 
1726882399.42607: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11661 1726882399.42657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11661 1726882399.42697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11661 1726882399.42726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882399.42771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11661 1726882399.42790: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11661 1726882399.42917: variable 'network_connections' from source: task vars 11661 1726882399.42930: variable 'port2_profile' from source: play vars 11661 1726882399.42997: variable 'port2_profile' from source: play vars 11661 1726882399.43012: variable 'port1_profile' from source: play vars 11661 1726882399.43082: variable 'port1_profile' from source: play vars 11661 1726882399.43101: variable 'controller_profile' from source: play vars 11661 1726882399.43178: variable 'controller_profile' from source: play vars 11661 1726882399.43286: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11661 
1726882399.43479: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11661 1726882399.43535: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11661 1726882399.43584: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11661 1726882399.43628: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11661 1726882399.43697: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11661 1726882399.43731: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11661 1726882399.43772: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882399.43811: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11661 1726882399.43868: variable '__network_wireless_connections_defined' from source: role '' defaults 11661 1726882399.44168: variable 'network_connections' from source: task vars 11661 1726882399.44181: variable 'port2_profile' from source: play vars 11661 1726882399.44256: variable 'port2_profile' from source: play vars 11661 1726882399.44277: variable 'port1_profile' from source: play vars 11661 1726882399.44349: variable 'port1_profile' from source: play vars 11661 1726882399.44367: variable 'controller_profile' from source: play vars 11661 1726882399.44438: variable 'controller_profile' from source: play vars 11661 
1726882399.44478: variable '__network_packages_default_wireless' from source: role '' defaults 11661 1726882399.44560: variable '__network_wireless_connections_defined' from source: role '' defaults 11661 1726882399.44852: variable 'network_connections' from source: task vars 11661 1726882399.44862: variable 'port2_profile' from source: play vars 11661 1726882399.44936: variable 'port2_profile' from source: play vars 11661 1726882399.44948: variable 'port1_profile' from source: play vars 11661 1726882399.45019: variable 'port1_profile' from source: play vars 11661 1726882399.45032: variable 'controller_profile' from source: play vars 11661 1726882399.45106: variable 'controller_profile' from source: play vars 11661 1726882399.45134: variable '__network_packages_default_team' from source: role '' defaults 11661 1726882399.45215: variable '__network_team_connections_defined' from source: role '' defaults 11661 1726882399.45487: variable 'network_connections' from source: task vars 11661 1726882399.45495: variable 'port2_profile' from source: play vars 11661 1726882399.45565: variable 'port2_profile' from source: play vars 11661 1726882399.45578: variable 'port1_profile' from source: play vars 11661 1726882399.45642: variable 'port1_profile' from source: play vars 11661 1726882399.45653: variable 'controller_profile' from source: play vars 11661 1726882399.45723: variable 'controller_profile' from source: play vars 11661 1726882399.45784: variable '__network_service_name_default_initscripts' from source: role '' defaults 11661 1726882399.45847: variable '__network_service_name_default_initscripts' from source: role '' defaults 11661 1726882399.45857: variable '__network_packages_default_initscripts' from source: role '' defaults 11661 1726882399.45914: variable '__network_packages_default_initscripts' from source: role '' defaults 11661 1726882399.46141: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 11661 1726882399.46636: 
variable 'network_connections' from source: task vars 11661 1726882399.46645: variable 'port2_profile' from source: play vars 11661 1726882399.46713: variable 'port2_profile' from source: play vars 11661 1726882399.46725: variable 'port1_profile' from source: play vars 11661 1726882399.46787: variable 'port1_profile' from source: play vars 11661 1726882399.46799: variable 'controller_profile' from source: play vars 11661 1726882399.46867: variable 'controller_profile' from source: play vars 11661 1726882399.46880: variable 'ansible_distribution' from source: facts 11661 1726882399.46887: variable '__network_rh_distros' from source: role '' defaults 11661 1726882399.46895: variable 'ansible_distribution_major_version' from source: facts 11661 1726882399.46913: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 11661 1726882399.47094: variable 'ansible_distribution' from source: facts 11661 1726882399.47103: variable '__network_rh_distros' from source: role '' defaults 11661 1726882399.47111: variable 'ansible_distribution_major_version' from source: facts 11661 1726882399.47127: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 11661 1726882399.47305: variable 'ansible_distribution' from source: facts 11661 1726882399.47314: variable '__network_rh_distros' from source: role '' defaults 11661 1726882399.47322: variable 'ansible_distribution_major_version' from source: facts 11661 1726882399.47367: variable 'network_provider' from source: set_fact 11661 1726882399.47393: variable 'omit' from source: magic vars 11661 1726882399.47425: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11661 1726882399.47456: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11661 1726882399.47488: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11661 
1726882399.47509: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882399.47525: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882399.47556: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11661 1726882399.47566: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882399.47575: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882399.47682: Set connection var ansible_connection to ssh 11661 1726882399.47693: Set connection var ansible_pipelining to False 11661 1726882399.47702: Set connection var ansible_module_compression to ZIP_DEFLATED 11661 1726882399.47712: Set connection var ansible_timeout to 10 11661 1726882399.47718: Set connection var ansible_shell_type to sh 11661 1726882399.47728: Set connection var ansible_shell_executable to /bin/sh 11661 1726882399.47754: variable 'ansible_shell_executable' from source: unknown 11661 1726882399.47761: variable 'ansible_connection' from source: unknown 11661 1726882399.47769: variable 'ansible_module_compression' from source: unknown 11661 1726882399.47776: variable 'ansible_shell_type' from source: unknown 11661 1726882399.47783: variable 'ansible_shell_executable' from source: unknown 11661 1726882399.47791: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882399.47798: variable 'ansible_pipelining' from source: unknown 11661 1726882399.47804: variable 'ansible_timeout' from source: unknown 11661 1726882399.47811: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882399.47920: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11661 1726882399.47935: variable 'omit' from source: magic vars 11661 1726882399.47944: starting attempt loop 11661 1726882399.47949: running the handler 11661 1726882399.48032: variable 'ansible_facts' from source: unknown 11661 1726882399.48817: _low_level_execute_command(): starting 11661 1726882399.48830: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11661 1726882399.49568: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882399.49585: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882399.49601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882399.49620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882399.49670: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882399.49682: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882399.49696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882399.49712: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882399.49725: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882399.49734: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882399.49751: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882399.49767: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882399.49784: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882399.49796: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882399.49807: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882399.49820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882399.49901: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882399.49925: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882399.49943: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882399.50085: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882399.51765: stdout chunk (state=3): >>>/root <<< 11661 1726882399.51946: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882399.51950: stdout chunk (state=3): >>><<< 11661 1726882399.51952: stderr chunk (state=3): >>><<< 11661 1726882399.52070: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882399.52076: _low_level_execute_command(): starting 11661 1726882399.52079: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882399.5197704-12921-214290860041602 `" && echo ansible-tmp-1726882399.5197704-12921-214290860041602="` echo /root/.ansible/tmp/ansible-tmp-1726882399.5197704-12921-214290860041602 `" ) && sleep 0' 11661 1726882399.52879: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882399.52916: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 11661 1726882399.52920: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882399.52922: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882399.52924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882399.53001: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882399.53021: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882399.53154: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882399.55113: stdout chunk (state=3): >>>ansible-tmp-1726882399.5197704-12921-214290860041602=/root/.ansible/tmp/ansible-tmp-1726882399.5197704-12921-214290860041602 <<< 11661 1726882399.55212: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882399.55306: stderr chunk (state=3): >>><<< 11661 1726882399.55310: stdout chunk (state=3): >>><<< 11661 1726882399.55397: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882399.5197704-12921-214290860041602=/root/.ansible/tmp/ansible-tmp-1726882399.5197704-12921-214290860041602 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 
0 11661 1726882399.55406: variable 'ansible_module_compression' from source: unknown 11661 1726882399.55670: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11661k_nrvl51/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 11661 1726882399.55674: variable 'ansible_facts' from source: unknown 11661 1726882399.55676: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882399.5197704-12921-214290860041602/AnsiballZ_systemd.py 11661 1726882399.55868: Sending initial data 11661 1726882399.55872: Sent initial data (156 bytes) 11661 1726882399.56894: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882399.56905: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882399.56915: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882399.56927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882399.56961: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882399.56996: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882399.56999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882399.57001: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882399.57004: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882399.57059: stderr 
chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882399.57066: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882399.57070: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882399.57167: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882399.58973: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11661 1726882399.59070: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 11661 1726882399.59162: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11661k_nrvl51/tmp5f0ql87o /root/.ansible/tmp/ansible-tmp-1726882399.5197704-12921-214290860041602/AnsiballZ_systemd.py <<< 11661 1726882399.59265: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11661 1726882399.61636: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882399.61770: stderr chunk (state=3): >>><<< 11661 1726882399.61773: stdout chunk (state=3): >>><<< 11661 1726882399.61775: done transferring module to remote 11661 1726882399.61778: _low_level_execute_command(): starting 11661 1726882399.61780: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882399.5197704-12921-214290860041602/ 
/root/.ansible/tmp/ansible-tmp-1726882399.5197704-12921-214290860041602/AnsiballZ_systemd.py && sleep 0' 11661 1726882399.62327: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882399.62346: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882399.62349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882399.62354: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882399.62392: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 11661 1726882399.62396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 11661 1726882399.62399: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882399.62414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882399.62428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882399.62437: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882399.62446: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882399.62461: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882399.62545: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882399.62570: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882399.62584: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882399.62710: stderr 
chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882399.64476: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882399.64523: stderr chunk (state=3): >>><<< 11661 1726882399.64526: stdout chunk (state=3): >>><<< 11661 1726882399.64539: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882399.64542: _low_level_execute_command(): starting 11661 1726882399.64546: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882399.5197704-12921-214290860041602/AnsiballZ_systemd.py && sleep 0' 11661 1726882399.64986: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882399.64990: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882399.65029: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882399.65032: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882399.65034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882399.65088: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882399.65095: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882399.65097: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882399.65198: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882399.90312: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6692", "ControlPID": "0", "BusName": 
"org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ExecMainStartTimestampMonotonic": "202392137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6692", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManag<<< 11661 1726882399.90349: stdout chunk (state=3): >>>er.service", "ControlGroupId": "3602", "MemoryCurrent": "8859648", "MemoryAvailable": "infinity", "CPUUsageNSec": "552525000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", 
"CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": 
"null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service 
multi-user.target network.target shutdown.target cloud-init.service", "After": "cloud-init-local.service dbus-broker.service network-pre.target system.slice dbus.socket systemd-journald.socket basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:57 EDT", "StateChangeTimestampMonotonic": "316658837", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveExitTimestampMonotonic": "202392395", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveEnterTimestampMonotonic": "202472383", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveExitTimestampMonotonic": "202362940", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveEnterTimestampMonotonic": "202381901", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ConditionTimestampMonotonic": "202382734", "AssertTimestamp": "Fri 2024-09-20 21:31:03 EDT", "AssertTimestampMonotonic": "202382737", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": 
"55e27919215348fab37a11b7ea324f90", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 11661 1726882399.91982: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 11661 1726882399.91986: stdout chunk (state=3): >>><<< 11661 1726882399.91989: stderr chunk (state=3): >>><<< 11661 1726882399.92274: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6692", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ExecMainStartTimestampMonotonic": "202392137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6692", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager 
--no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "8859648", "MemoryAvailable": "infinity", "CPUUsageNSec": "552525000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", 
"LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": 
"read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service multi-user.target network.target shutdown.target cloud-init.service", "After": "cloud-init-local.service dbus-broker.service network-pre.target system.slice dbus.socket systemd-journald.socket basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:57 EDT", "StateChangeTimestampMonotonic": "316658837", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveExitTimestampMonotonic": "202392395", "ActiveEnterTimestamp": 
"Fri 2024-09-20 21:31:03 EDT", "ActiveEnterTimestampMonotonic": "202472383", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveExitTimestampMonotonic": "202362940", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveEnterTimestampMonotonic": "202381901", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ConditionTimestampMonotonic": "202382734", "AssertTimestamp": "Fri 2024-09-20 21:31:03 EDT", "AssertTimestampMonotonic": "202382737", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "55e27919215348fab37a11b7ea324f90", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 11661 1726882399.92286: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882399.5197704-12921-214290860041602/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11661 1726882399.92289: _low_level_execute_command(): starting 11661 1726882399.92291: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882399.5197704-12921-214290860041602/ > /dev/null 2>&1 && sleep 0' 11661 1726882399.92888: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882399.92904: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882399.92922: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 11661 1726882399.92945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882399.92993: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882399.93005: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882399.93020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882399.93044: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882399.93057: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882399.93072: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882399.93084: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882399.93098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882399.93114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882399.93126: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882399.93138: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882399.93159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882399.93240: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882399.93273: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882399.93292: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882399.93424: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882399.95321: stderr chunk (state=3): >>>debug2: Received exit 
status from master 0 <<< 11661 1726882399.95324: stdout chunk (state=3): >>><<< 11661 1726882399.95327: stderr chunk (state=3): >>><<< 11661 1726882399.95885: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882399.95889: handler run complete 11661 1726882399.95891: attempt loop complete, returning result 11661 1726882399.95893: _execute() done 11661 1726882399.95895: dumping result to json 11661 1726882399.95897: done dumping result, returning 11661 1726882399.95899: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-896b-2321-000000000088] 11661 1726882399.95901: sending task result for task 0e448fcc-3ce9-896b-2321-000000000088 11661 1726882399.96008: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000088 11661 1726882399.96011: 
WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11661 1726882399.96060: no more pending results, returning what we have 11661 1726882399.96065: results queue empty 11661 1726882399.96066: checking for any_errors_fatal 11661 1726882399.96071: done checking for any_errors_fatal 11661 1726882399.96072: checking for max_fail_percentage 11661 1726882399.96074: done checking for max_fail_percentage 11661 1726882399.96075: checking to see if all hosts have failed and the running result is not ok 11661 1726882399.96076: done checking to see if all hosts have failed 11661 1726882399.96076: getting the remaining hosts for this loop 11661 1726882399.96078: done getting the remaining hosts for this loop 11661 1726882399.96081: getting the next task for host managed_node2 11661 1726882399.96087: done getting next task for host managed_node2 11661 1726882399.96091: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 11661 1726882399.96094: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11661 1726882399.96105: getting variables 11661 1726882399.96106: in VariableManager get_vars() 11661 1726882399.96144: Calling all_inventory to load vars for managed_node2 11661 1726882399.96146: Calling groups_inventory to load vars for managed_node2 11661 1726882399.96149: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882399.96158: Calling all_plugins_play to load vars for managed_node2 11661 1726882399.96162: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882399.96167: Calling groups_plugins_play to load vars for managed_node2 11661 1726882399.97745: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882399.99545: done with get_vars() 11661 1726882399.99580: done getting variables 11661 1726882399.99649: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:33:19 -0400 (0:00:00.629) 0:00:28.710 ****** 11661 1726882399.99687: entering _queue_task() for managed_node2/service 11661 1726882400.00023: worker is 1 (out of 1 available) 11661 1726882400.00035: exiting _queue_task() for managed_node2/service 11661 1726882400.00048: done queuing things up, now waiting for results queue to drain 11661 1726882400.00050: waiting for pending results... 
11661 1726882400.00341: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 11661 1726882400.00512: in run() - task 0e448fcc-3ce9-896b-2321-000000000089 11661 1726882400.00532: variable 'ansible_search_path' from source: unknown 11661 1726882400.00539: variable 'ansible_search_path' from source: unknown 11661 1726882400.00580: calling self._execute() 11661 1726882400.00686: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882400.00697: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882400.00716: variable 'omit' from source: magic vars 11661 1726882400.01075: variable 'ansible_distribution_major_version' from source: facts 11661 1726882400.01093: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882400.01214: variable 'network_provider' from source: set_fact 11661 1726882400.01224: Evaluated conditional (network_provider == "nm"): True 11661 1726882400.01320: variable '__network_wpa_supplicant_required' from source: role '' defaults 11661 1726882400.01415: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11661 1726882400.01592: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11661 1726882400.03879: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11661 1726882400.04060: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11661 1726882400.04109: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11661 1726882400.04154: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11661 1726882400.04192: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11661 1726882400.04298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11661 1726882400.04332: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11661 1726882400.04369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882400.04415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11661 1726882400.04433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11661 1726882400.04488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11661 1726882400.04515: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11661 1726882400.04543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882400.04591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11661 1726882400.04613: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11661 1726882400.04658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11661 1726882400.04692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11661 1726882400.04715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882400.04752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11661 1726882400.04771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11661 1726882400.04931: variable 'network_connections' from source: task vars 11661 1726882400.04947: variable 'port2_profile' from source: play vars 11661 1726882400.05026: variable 'port2_profile' from source: play vars 11661 1726882400.05040: variable 'port1_profile' from source: play vars 11661 1726882400.05104: variable 'port1_profile' from source: play vars 11661 1726882400.05120: variable 'controller_profile' from source: play vars 11661 1726882400.05181: variable 'controller_profile' from source: play vars 11661 
1726882400.05260: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11661 1726882400.05434: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11661 1726882400.05482: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11661 1726882400.05517: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11661 1726882400.05552: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11661 1726882400.05600: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11661 1726882400.05626: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11661 1726882400.05663: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882400.05696: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11661 1726882400.05750: variable '__network_wireless_connections_defined' from source: role '' defaults 11661 1726882400.06017: variable 'network_connections' from source: task vars 11661 1726882400.06027: variable 'port2_profile' from source: play vars 11661 1726882400.06093: variable 'port2_profile' from source: play vars 11661 1726882400.06105: variable 'port1_profile' from source: play vars 11661 1726882400.06169: variable 'port1_profile' from source: play vars 11661 1726882400.06190: variable 
'controller_profile' from source: play vars 11661 1726882400.06254: variable 'controller_profile' from source: play vars 11661 1726882400.06290: Evaluated conditional (__network_wpa_supplicant_required): False 11661 1726882400.06297: when evaluation is False, skipping this task 11661 1726882400.06307: _execute() done 11661 1726882400.06314: dumping result to json 11661 1726882400.06321: done dumping result, returning 11661 1726882400.06331: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0e448fcc-3ce9-896b-2321-000000000089] 11661 1726882400.06340: sending task result for task 0e448fcc-3ce9-896b-2321-000000000089 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 11661 1726882400.06488: no more pending results, returning what we have 11661 1726882400.06492: results queue empty 11661 1726882400.06493: checking for any_errors_fatal 11661 1726882400.06511: done checking for any_errors_fatal 11661 1726882400.06512: checking for max_fail_percentage 11661 1726882400.06514: done checking for max_fail_percentage 11661 1726882400.06515: checking to see if all hosts have failed and the running result is not ok 11661 1726882400.06516: done checking to see if all hosts have failed 11661 1726882400.06517: getting the remaining hosts for this loop 11661 1726882400.06518: done getting the remaining hosts for this loop 11661 1726882400.06522: getting the next task for host managed_node2 11661 1726882400.06530: done getting next task for host managed_node2 11661 1726882400.06534: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 11661 1726882400.06538: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11661 1726882400.06557: getting variables 11661 1726882400.06559: in VariableManager get_vars() 11661 1726882400.06602: Calling all_inventory to load vars for managed_node2 11661 1726882400.06605: Calling groups_inventory to load vars for managed_node2 11661 1726882400.06608: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882400.06617: Calling all_plugins_play to load vars for managed_node2 11661 1726882400.06620: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882400.06624: Calling groups_plugins_play to load vars for managed_node2 11661 1726882400.07583: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000089 11661 1726882400.07587: WORKER PROCESS EXITING 11661 1726882400.08321: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882400.18070: done with get_vars() 11661 1726882400.18101: done getting variables 11661 1726882400.18149: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] 
************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:33:20 -0400 (0:00:00.185) 0:00:28.895 ****** 11661 1726882400.18204: entering _queue_task() for managed_node2/service 11661 1726882400.18650: worker is 1 (out of 1 available) 11661 1726882400.18665: exiting _queue_task() for managed_node2/service 11661 1726882400.18678: done queuing things up, now waiting for results queue to drain 11661 1726882400.18680: waiting for pending results... 11661 1726882400.19000: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 11661 1726882400.19402: in run() - task 0e448fcc-3ce9-896b-2321-00000000008a 11661 1726882400.19421: variable 'ansible_search_path' from source: unknown 11661 1726882400.19429: variable 'ansible_search_path' from source: unknown 11661 1726882400.19474: calling self._execute() 11661 1726882400.19579: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882400.19594: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882400.19608: variable 'omit' from source: magic vars 11661 1726882400.20011: variable 'ansible_distribution_major_version' from source: facts 11661 1726882400.20035: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882400.20159: variable 'network_provider' from source: set_fact 11661 1726882400.20174: Evaluated conditional (network_provider == "initscripts"): False 11661 1726882400.20183: when evaluation is False, skipping this task 11661 1726882400.20189: _execute() done 11661 1726882400.20196: dumping result to json 11661 1726882400.20202: done dumping result, returning 11661 1726882400.20212: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-896b-2321-00000000008a] 11661 1726882400.20221: sending task result for task 
0e448fcc-3ce9-896b-2321-00000000008a skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11661 1726882400.20390: no more pending results, returning what we have 11661 1726882400.20395: results queue empty 11661 1726882400.20396: checking for any_errors_fatal 11661 1726882400.20406: done checking for any_errors_fatal 11661 1726882400.20406: checking for max_fail_percentage 11661 1726882400.20409: done checking for max_fail_percentage 11661 1726882400.20409: checking to see if all hosts have failed and the running result is not ok 11661 1726882400.20410: done checking to see if all hosts have failed 11661 1726882400.20411: getting the remaining hosts for this loop 11661 1726882400.20413: done getting the remaining hosts for this loop 11661 1726882400.20416: getting the next task for host managed_node2 11661 1726882400.20424: done getting next task for host managed_node2 11661 1726882400.20429: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 11661 1726882400.20433: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11661 1726882400.20454: getting variables 11661 1726882400.20456: in VariableManager get_vars() 11661 1726882400.20502: Calling all_inventory to load vars for managed_node2 11661 1726882400.20506: Calling groups_inventory to load vars for managed_node2 11661 1726882400.20508: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882400.20520: Calling all_plugins_play to load vars for managed_node2 11661 1726882400.20523: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882400.20526: Calling groups_plugins_play to load vars for managed_node2 11661 1726882400.21599: done sending task result for task 0e448fcc-3ce9-896b-2321-00000000008a 11661 1726882400.21603: WORKER PROCESS EXITING 11661 1726882400.22279: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882400.24343: done with get_vars() 11661 1726882400.24372: done getting variables 11661 1726882400.24431: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:33:20 -0400 (0:00:00.062) 0:00:28.958 ****** 11661 1726882400.24470: entering _queue_task() for managed_node2/copy 11661 1726882400.25302: worker is 1 (out of 1 available) 11661 1726882400.25316: exiting _queue_task() for managed_node2/copy 11661 1726882400.25330: done queuing things up, now waiting for results queue to drain 11661 1726882400.25331: waiting for pending results... 
11661 1726882400.25692: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 11661 1726882400.25876: in run() - task 0e448fcc-3ce9-896b-2321-00000000008b 11661 1726882400.25899: variable 'ansible_search_path' from source: unknown 11661 1726882400.25908: variable 'ansible_search_path' from source: unknown 11661 1726882400.25958: calling self._execute() 11661 1726882400.26074: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882400.26089: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882400.26108: variable 'omit' from source: magic vars 11661 1726882400.26532: variable 'ansible_distribution_major_version' from source: facts 11661 1726882400.26557: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882400.26692: variable 'network_provider' from source: set_fact 11661 1726882400.26710: Evaluated conditional (network_provider == "initscripts"): False 11661 1726882400.26718: when evaluation is False, skipping this task 11661 1726882400.26726: _execute() done 11661 1726882400.26733: dumping result to json 11661 1726882400.26741: done dumping result, returning 11661 1726882400.26756: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-896b-2321-00000000008b] 11661 1726882400.26772: sending task result for task 0e448fcc-3ce9-896b-2321-00000000008b skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 11661 1726882400.26942: no more pending results, returning what we have 11661 1726882400.26946: results queue empty 11661 1726882400.26947: checking for any_errors_fatal 11661 1726882400.26958: done checking for any_errors_fatal 11661 1726882400.26958: checking for max_fail_percentage 11661 
1726882400.26960: done checking for max_fail_percentage 11661 1726882400.26962: checking to see if all hosts have failed and the running result is not ok 11661 1726882400.26962: done checking to see if all hosts have failed 11661 1726882400.26965: getting the remaining hosts for this loop 11661 1726882400.26966: done getting the remaining hosts for this loop 11661 1726882400.26970: getting the next task for host managed_node2 11661 1726882400.26979: done getting next task for host managed_node2 11661 1726882400.26986: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 11661 1726882400.26990: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11661 1726882400.27015: getting variables 11661 1726882400.27017: in VariableManager get_vars() 11661 1726882400.27062: Calling all_inventory to load vars for managed_node2 11661 1726882400.27066: Calling groups_inventory to load vars for managed_node2 11661 1726882400.27070: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882400.27083: Calling all_plugins_play to load vars for managed_node2 11661 1726882400.27087: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882400.27091: Calling groups_plugins_play to load vars for managed_node2 11661 1726882400.28749: done sending task result for task 0e448fcc-3ce9-896b-2321-00000000008b 11661 1726882400.28756: WORKER PROCESS EXITING 11661 1726882400.30144: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882400.32431: done with get_vars() 11661 1726882400.32471: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:33:20 -0400 (0:00:00.081) 0:00:29.039 ****** 11661 1726882400.32578: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 11661 1726882400.33072: worker is 1 (out of 1 available) 11661 1726882400.33084: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 11661 1726882400.33095: done queuing things up, now waiting for results queue to drain 11661 1726882400.33097: waiting for pending results... 
11661 1726882400.33404: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 11661 1726882400.33604: in run() - task 0e448fcc-3ce9-896b-2321-00000000008c 11661 1726882400.33624: variable 'ansible_search_path' from source: unknown 11661 1726882400.33632: variable 'ansible_search_path' from source: unknown 11661 1726882400.33682: calling self._execute() 11661 1726882400.33793: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882400.33805: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882400.33821: variable 'omit' from source: magic vars 11661 1726882400.34241: variable 'ansible_distribution_major_version' from source: facts 11661 1726882400.34265: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882400.34277: variable 'omit' from source: magic vars 11661 1726882400.34354: variable 'omit' from source: magic vars 11661 1726882400.34535: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11661 1726882400.37036: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11661 1726882400.37123: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11661 1726882400.37176: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11661 1726882400.37220: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11661 1726882400.37262: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11661 1726882400.37362: variable 'network_provider' from source: set_fact 11661 1726882400.37518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11661 1726882400.37557: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11661 1726882400.37595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11661 1726882400.37640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11661 1726882400.37669: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11661 1726882400.37755: variable 'omit' from source: magic vars 11661 1726882400.37897: variable 'omit' from source: magic vars 11661 1726882400.38017: variable 'network_connections' from source: task vars 11661 1726882400.38033: variable 'port2_profile' from source: play vars 11661 1726882400.38105: variable 'port2_profile' from source: play vars 11661 1726882400.38123: variable 'port1_profile' from source: play vars 11661 1726882400.38188: variable 'port1_profile' from source: play vars 11661 1726882400.38203: variable 'controller_profile' from source: play vars 11661 1726882400.38275: variable 'controller_profile' from source: play vars 11661 1726882400.38456: variable 'omit' from source: magic vars 11661 1726882400.38472: variable '__lsr_ansible_managed' from source: task vars 11661 1726882400.38534: variable '__lsr_ansible_managed' from source: task vars 11661 1726882400.38739: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 11661 
1726882400.38992: Loaded config def from plugin (lookup/template) 11661 1726882400.39002: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 11661 1726882400.39034: File lookup term: get_ansible_managed.j2 11661 1726882400.39042: variable 'ansible_search_path' from source: unknown 11661 1726882400.39054: evaluation_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 11661 1726882400.39075: search_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 11661 1726882400.39105: variable 'ansible_search_path' from source: unknown 11661 1726882400.46068: variable 'ansible_managed' from source: unknown 11661 1726882400.46225: variable 'omit' from source: magic vars 11661 1726882400.46273: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11661 1726882400.46303: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11661 1726882400.46324: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11661 1726882400.46349: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882400.46372: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882400.46405: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11661 1726882400.46413: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882400.46419: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882400.46523: Set connection var ansible_connection to ssh 11661 1726882400.46533: Set connection var ansible_pipelining to False 11661 1726882400.46543: Set connection var ansible_module_compression to ZIP_DEFLATED 11661 1726882400.46562: Set connection var ansible_timeout to 10 11661 1726882400.46571: Set connection var ansible_shell_type to sh 11661 1726882400.46587: Set connection var ansible_shell_executable to /bin/sh 11661 1726882400.46611: variable 'ansible_shell_executable' from source: unknown 11661 1726882400.46618: variable 'ansible_connection' from source: unknown 11661 1726882400.46624: variable 'ansible_module_compression' from source: unknown 11661 1726882400.46629: variable 'ansible_shell_type' from source: unknown 11661 1726882400.46635: variable 'ansible_shell_executable' from source: unknown 11661 1726882400.46640: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882400.46646: variable 'ansible_pipelining' from source: unknown 11661 1726882400.46655: variable 'ansible_timeout' from source: unknown 11661 1726882400.46673: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 
1726882400.46815: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11661 1726882400.46830: variable 'omit' from source: magic vars 11661 1726882400.46840: starting attempt loop 11661 1726882400.46846: running the handler 11661 1726882400.46868: _low_level_execute_command(): starting 11661 1726882400.46879: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11661 1726882400.47659: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882400.47682: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882400.47697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882400.47713: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882400.47757: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882400.47771: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882400.47791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882400.47808: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882400.47819: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882400.47829: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882400.47839: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882400.47853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882400.47872: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882400.47885: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882400.47898: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882400.47911: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882400.47991: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882400.48019: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882400.48034: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882400.48175: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882400.49872: stdout chunk (state=3): >>>/root <<< 11661 1726882400.50057: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882400.50060: stdout chunk (state=3): >>><<< 11661 1726882400.50063: stderr chunk (state=3): >>><<< 11661 1726882400.50171: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882400.50175: _low_level_execute_command(): starting 11661 1726882400.50178: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882400.5008385-12962-66061989820195 `" && echo ansible-tmp-1726882400.5008385-12962-66061989820195="` echo /root/.ansible/tmp/ansible-tmp-1726882400.5008385-12962-66061989820195 `" ) && sleep 0' 11661 1726882400.50867: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882400.50985: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882400.50999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882400.51015: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882400.51059: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882400.51074: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882400.51087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882400.51103: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882400.51113: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882400.51123: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882400.51134: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882400.51148: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882400.51170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882400.51182: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882400.51191: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882400.51203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882400.51282: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882400.51654: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882400.51673: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882400.51804: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882400.53698: stdout chunk (state=3): >>>ansible-tmp-1726882400.5008385-12962-66061989820195=/root/.ansible/tmp/ansible-tmp-1726882400.5008385-12962-66061989820195 <<< 11661 1726882400.53896: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882400.53900: stdout chunk (state=3): >>><<< 11661 1726882400.53903: stderr chunk (state=3): >>><<< 11661 1726882400.53971: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882400.5008385-12962-66061989820195=/root/.ansible/tmp/ansible-tmp-1726882400.5008385-12962-66061989820195 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882400.54261: variable 'ansible_module_compression' from source: unknown 11661 1726882400.54267: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11661k_nrvl51/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 11661 1726882400.54270: variable 'ansible_facts' from source: unknown 11661 1726882400.54272: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882400.5008385-12962-66061989820195/AnsiballZ_network_connections.py 11661 1726882400.54854: Sending initial data 11661 1726882400.54885: Sent initial data (167 bytes) 11661 1726882400.57926: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882400.57931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882400.57980: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 11661 1726882400.57984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882400.58002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882400.58007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882400.58085: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882400.58098: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882400.58110: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882400.58232: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882400.59997: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11661 1726882400.60101: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 11661 1726882400.60207: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11661k_nrvl51/tmpy5igikta /root/.ansible/tmp/ansible-tmp-1726882400.5008385-12962-66061989820195/AnsiballZ_network_connections.py <<< 11661 
1726882400.60303: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11661 1726882400.62373: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882400.62510: stderr chunk (state=3): >>><<< 11661 1726882400.62514: stdout chunk (state=3): >>><<< 11661 1726882400.62536: done transferring module to remote 11661 1726882400.62547: _low_level_execute_command(): starting 11661 1726882400.62555: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882400.5008385-12962-66061989820195/ /root/.ansible/tmp/ansible-tmp-1726882400.5008385-12962-66061989820195/AnsiballZ_network_connections.py && sleep 0' 11661 1726882400.63686: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882400.63700: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882400.63714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882400.63732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882400.63776: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882400.63790: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882400.63805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882400.63821: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882400.63832: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882400.63842: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882400.63856: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882400.63872: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 11661 1726882400.63887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882400.63899: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882400.63911: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882400.63923: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882400.63995: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882400.64021: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882400.64038: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882400.64172: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882400.66035: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882400.66039: stdout chunk (state=3): >>><<< 11661 1726882400.66042: stderr chunk (state=3): >>><<< 11661 1726882400.66142: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 
originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882400.66146: _low_level_execute_command(): starting 11661 1726882400.66148: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882400.5008385-12962-66061989820195/AnsiballZ_network_connections.py && sleep 0' 11661 1726882400.66977: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882400.66985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882400.67011: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 11661 1726882400.67015: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882400.67017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882400.67088: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882400.67101: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 11661 1726882400.67237: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882401.14220: stdout chunk (state=3): >>>Traceback (most recent call last):<<< 11661 1726882401.14227: stdout chunk (state=3): >>> File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_aw_w12yc/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_aw_w12yc/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail <<< 11661 1726882401.14246: stdout chunk (state=3): >>>ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/1c852c52-21d3-4d36-862f-c83a68bc1805: error=unknown <<< 11661 1726882401.16088: stdout chunk (state=3): >>>Traceback (most recent call last): <<< 11661 1726882401.16094: stdout chunk (state=3): >>> File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_aw_w12yc/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back <<< 11661 1726882401.16100: stdout chunk (state=3): >>> File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_aw_w12yc/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail <<< 11661 1726882401.16130: stdout chunk (state=3): 
>>>ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/6d0f0ab8-7234-4420-a06e-101a4b2a8221: error=unknown<<< 11661 1726882401.16135: stdout chunk (state=3): >>> <<< 11661 1726882401.17958: stdout chunk (state=3): >>>Traceback (most recent call last): <<< 11661 1726882401.17970: stdout chunk (state=3): >>> File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_aw_w12yc/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back <<< 11661 1726882401.17990: stdout chunk (state=3): >>> File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_aw_w12yc/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail <<< 11661 1726882401.18004: stdout chunk (state=3): >>>ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/6479a90a-8c97-4625-ab42-bedea857a927: error=unknown <<< 11661 1726882401.18230: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": 
"bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 11661 1726882401.19845: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 11661 1726882401.19906: stderr chunk (state=3): >>><<< 11661 1726882401.19909: stdout chunk (state=3): >>><<< 11661 1726882401.19929: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_aw_w12yc/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_aw_w12yc/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/1c852c52-21d3-4d36-862f-c83a68bc1805: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_aw_w12yc/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_aw_w12yc/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail 
ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/6d0f0ab8-7234-4420-a06e-101a4b2a8221: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_aw_w12yc/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_aw_w12yc/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/6479a90a-8c97-4625-ab42-bedea857a927: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
11661 1726882401.19971: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0.1', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0.0', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882400.5008385-12962-66061989820195/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11661 1726882401.19978: _low_level_execute_command(): starting 11661 1726882401.19983: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882400.5008385-12962-66061989820195/ > /dev/null 2>&1 && sleep 0' 11661 1726882401.20436: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882401.20441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882401.20477: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 11661 1726882401.20490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882401.20500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882401.20548: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882401.20556: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882401.20569: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882401.20683: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882401.22531: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882401.22584: stderr chunk (state=3): >>><<< 11661 1726882401.22587: stdout chunk (state=3): >>><<< 11661 1726882401.22600: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882401.22606: handler run complete 11661 1726882401.22633: attempt loop complete, returning result 11661 1726882401.22636: _execute() done 11661 1726882401.22638: dumping result to json 11661 1726882401.22643: done dumping result, returning 11661 1726882401.22651: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0e448fcc-3ce9-896b-2321-00000000008c] 11661 1726882401.22659: sending task result for task 0e448fcc-3ce9-896b-2321-00000000008c 11661 1726882401.22762: done sending task result for task 0e448fcc-3ce9-896b-2321-00000000008c 11661 1726882401.22768: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.1", "persistent_state": "absent", "state": "down" }, { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 11661 1726882401.22872: no more pending results, returning what we have 11661 1726882401.22875: results queue empty 11661 1726882401.22876: checking for any_errors_fatal 11661 1726882401.22883: done checking for any_errors_fatal 11661 1726882401.22883: checking for max_fail_percentage 11661 1726882401.22885: done checking for max_fail_percentage 11661 1726882401.22886: checking to see if all hosts have failed and the running result is not ok 11661 
1726882401.22887: done checking to see if all hosts have failed 11661 1726882401.22887: getting the remaining hosts for this loop 11661 1726882401.22889: done getting the remaining hosts for this loop 11661 1726882401.22892: getting the next task for host managed_node2 11661 1726882401.22899: done getting next task for host managed_node2 11661 1726882401.22903: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 11661 1726882401.22906: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11661 1726882401.22916: getting variables 11661 1726882401.22917: in VariableManager get_vars() 11661 1726882401.22953: Calling all_inventory to load vars for managed_node2 11661 1726882401.22956: Calling groups_inventory to load vars for managed_node2 11661 1726882401.22958: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882401.22968: Calling all_plugins_play to load vars for managed_node2 11661 1726882401.22975: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882401.22978: Calling groups_plugins_play to load vars for managed_node2 11661 1726882401.23800: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882401.24783: done with get_vars() 11661 1726882401.24801: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:33:21 -0400 (0:00:00.922) 0:00:29.962 ****** 11661 1726882401.24870: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 11661 1726882401.25107: worker is 1 (out of 1 available) 11661 1726882401.25122: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 11661 1726882401.25135: done queuing things up, now waiting for results queue to drain 11661 1726882401.25136: waiting for pending results... 
11661 1726882401.25319: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 11661 1726882401.25414: in run() - task 0e448fcc-3ce9-896b-2321-00000000008d 11661 1726882401.25425: variable 'ansible_search_path' from source: unknown 11661 1726882401.25428: variable 'ansible_search_path' from source: unknown 11661 1726882401.25455: calling self._execute() 11661 1726882401.25537: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882401.25541: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882401.25553: variable 'omit' from source: magic vars 11661 1726882401.25842: variable 'ansible_distribution_major_version' from source: facts 11661 1726882401.25854: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882401.25942: variable 'network_state' from source: role '' defaults 11661 1726882401.25952: Evaluated conditional (network_state != {}): False 11661 1726882401.25957: when evaluation is False, skipping this task 11661 1726882401.25960: _execute() done 11661 1726882401.25963: dumping result to json 11661 1726882401.25968: done dumping result, returning 11661 1726882401.25975: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-896b-2321-00000000008d] 11661 1726882401.25980: sending task result for task 0e448fcc-3ce9-896b-2321-00000000008d 11661 1726882401.26066: done sending task result for task 0e448fcc-3ce9-896b-2321-00000000008d 11661 1726882401.26068: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11661 1726882401.26114: no more pending results, returning what we have 11661 1726882401.26118: results queue empty 11661 1726882401.26119: checking for any_errors_fatal 11661 1726882401.26135: done checking for any_errors_fatal 
11661 1726882401.26136: checking for max_fail_percentage 11661 1726882401.26138: done checking for max_fail_percentage 11661 1726882401.26139: checking to see if all hosts have failed and the running result is not ok 11661 1726882401.26139: done checking to see if all hosts have failed 11661 1726882401.26140: getting the remaining hosts for this loop 11661 1726882401.26142: done getting the remaining hosts for this loop 11661 1726882401.26145: getting the next task for host managed_node2 11661 1726882401.26152: done getting next task for host managed_node2 11661 1726882401.26156: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 11661 1726882401.26164: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11661 1726882401.26185: getting variables 11661 1726882401.26186: in VariableManager get_vars() 11661 1726882401.26220: Calling all_inventory to load vars for managed_node2 11661 1726882401.26223: Calling groups_inventory to load vars for managed_node2 11661 1726882401.26225: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882401.26234: Calling all_plugins_play to load vars for managed_node2 11661 1726882401.26236: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882401.26239: Calling groups_plugins_play to load vars for managed_node2 11661 1726882401.27262: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882401.28626: done with get_vars() 11661 1726882401.28648: done getting variables 11661 1726882401.28699: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:33:21 -0400 (0:00:00.038) 0:00:30.000 ****** 11661 1726882401.28726: entering _queue_task() for managed_node2/debug 11661 1726882401.28955: worker is 1 (out of 1 available) 11661 1726882401.28971: exiting _queue_task() for managed_node2/debug 11661 1726882401.28983: done queuing things up, now waiting for results queue to drain 11661 1726882401.28985: waiting for pending results... 
11661 1726882401.29160: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 11661 1726882401.29262: in run() - task 0e448fcc-3ce9-896b-2321-00000000008e 11661 1726882401.29279: variable 'ansible_search_path' from source: unknown 11661 1726882401.29283: variable 'ansible_search_path' from source: unknown 11661 1726882401.29311: calling self._execute() 11661 1726882401.29393: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882401.29397: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882401.29407: variable 'omit' from source: magic vars 11661 1726882401.29691: variable 'ansible_distribution_major_version' from source: facts 11661 1726882401.29701: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882401.29708: variable 'omit' from source: magic vars 11661 1726882401.29755: variable 'omit' from source: magic vars 11661 1726882401.29782: variable 'omit' from source: magic vars 11661 1726882401.29815: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11661 1726882401.29843: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11661 1726882401.29865: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11661 1726882401.29882: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882401.29892: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882401.29915: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11661 1726882401.29918: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882401.29920: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 11661 1726882401.29992: Set connection var ansible_connection to ssh 11661 1726882401.29996: Set connection var ansible_pipelining to False 11661 1726882401.30002: Set connection var ansible_module_compression to ZIP_DEFLATED 11661 1726882401.30008: Set connection var ansible_timeout to 10 11661 1726882401.30011: Set connection var ansible_shell_type to sh 11661 1726882401.30017: Set connection var ansible_shell_executable to /bin/sh 11661 1726882401.30033: variable 'ansible_shell_executable' from source: unknown 11661 1726882401.30036: variable 'ansible_connection' from source: unknown 11661 1726882401.30039: variable 'ansible_module_compression' from source: unknown 11661 1726882401.30042: variable 'ansible_shell_type' from source: unknown 11661 1726882401.30044: variable 'ansible_shell_executable' from source: unknown 11661 1726882401.30046: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882401.30048: variable 'ansible_pipelining' from source: unknown 11661 1726882401.30050: variable 'ansible_timeout' from source: unknown 11661 1726882401.30055: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882401.30154: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11661 1726882401.30162: variable 'omit' from source: magic vars 11661 1726882401.30168: starting attempt loop 11661 1726882401.30172: running the handler 11661 1726882401.30265: variable '__network_connections_result' from source: set_fact 11661 1726882401.30308: handler run complete 11661 1726882401.30319: attempt loop complete, returning result 11661 1726882401.30322: _execute() done 11661 1726882401.30325: dumping result to json 11661 1726882401.30327: 
done dumping result, returning 11661 1726882401.30335: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-896b-2321-00000000008e] 11661 1726882401.30340: sending task result for task 0e448fcc-3ce9-896b-2321-00000000008e 11661 1726882401.30428: done sending task result for task 0e448fcc-3ce9-896b-2321-00000000008e 11661 1726882401.30431: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "" ] } 11661 1726882401.30494: no more pending results, returning what we have 11661 1726882401.30497: results queue empty 11661 1726882401.30498: checking for any_errors_fatal 11661 1726882401.30504: done checking for any_errors_fatal 11661 1726882401.30505: checking for max_fail_percentage 11661 1726882401.30506: done checking for max_fail_percentage 11661 1726882401.30507: checking to see if all hosts have failed and the running result is not ok 11661 1726882401.30508: done checking to see if all hosts have failed 11661 1726882401.30509: getting the remaining hosts for this loop 11661 1726882401.30510: done getting the remaining hosts for this loop 11661 1726882401.30514: getting the next task for host managed_node2 11661 1726882401.30525: done getting next task for host managed_node2 11661 1726882401.30530: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 11661 1726882401.30533: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11661 1726882401.30543: getting variables 11661 1726882401.30545: in VariableManager get_vars() 11661 1726882401.30582: Calling all_inventory to load vars for managed_node2 11661 1726882401.30585: Calling groups_inventory to load vars for managed_node2 11661 1726882401.30587: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882401.30595: Calling all_plugins_play to load vars for managed_node2 11661 1726882401.30598: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882401.30600: Calling groups_plugins_play to load vars for managed_node2 11661 1726882401.31435: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882401.32923: done with get_vars() 11661 1726882401.32956: done getting variables 11661 1726882401.33023: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:33:21 -0400 (0:00:00.043) 0:00:30.044 ****** 11661 1726882401.33139: entering _queue_task() for managed_node2/debug 11661 1726882401.33656: worker is 1 (out of 1 available) 11661 
1726882401.33671: exiting _queue_task() for managed_node2/debug 11661 1726882401.33691: done queuing things up, now waiting for results queue to drain 11661 1726882401.33693: waiting for pending results... 11661 1726882401.33848: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 11661 1726882401.33955: in run() - task 0e448fcc-3ce9-896b-2321-00000000008f 11661 1726882401.33966: variable 'ansible_search_path' from source: unknown 11661 1726882401.33969: variable 'ansible_search_path' from source: unknown 11661 1726882401.34003: calling self._execute() 11661 1726882401.34087: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882401.34100: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882401.34103: variable 'omit' from source: magic vars 11661 1726882401.34386: variable 'ansible_distribution_major_version' from source: facts 11661 1726882401.34396: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882401.34403: variable 'omit' from source: magic vars 11661 1726882401.34449: variable 'omit' from source: magic vars 11661 1726882401.34481: variable 'omit' from source: magic vars 11661 1726882401.34517: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11661 1726882401.34544: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11661 1726882401.34563: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11661 1726882401.34578: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882401.34588: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882401.34611: variable 
'inventory_hostname' from source: host vars for 'managed_node2' 11661 1726882401.34615: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882401.34617: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882401.34699: Set connection var ansible_connection to ssh 11661 1726882401.34704: Set connection var ansible_pipelining to False 11661 1726882401.34707: Set connection var ansible_module_compression to ZIP_DEFLATED 11661 1726882401.34715: Set connection var ansible_timeout to 10 11661 1726882401.34718: Set connection var ansible_shell_type to sh 11661 1726882401.34724: Set connection var ansible_shell_executable to /bin/sh 11661 1726882401.34740: variable 'ansible_shell_executable' from source: unknown 11661 1726882401.34743: variable 'ansible_connection' from source: unknown 11661 1726882401.34746: variable 'ansible_module_compression' from source: unknown 11661 1726882401.34748: variable 'ansible_shell_type' from source: unknown 11661 1726882401.34750: variable 'ansible_shell_executable' from source: unknown 11661 1726882401.34755: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882401.34757: variable 'ansible_pipelining' from source: unknown 11661 1726882401.34759: variable 'ansible_timeout' from source: unknown 11661 1726882401.34762: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882401.34859: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11661 1726882401.34871: variable 'omit' from source: magic vars 11661 1726882401.34880: starting attempt loop 11661 1726882401.34883: running the handler 11661 1726882401.34916: variable '__network_connections_result' from source: set_fact 11661 
1726882401.34972: variable '__network_connections_result' from source: set_fact 11661 1726882401.35058: handler run complete 11661 1726882401.35078: attempt loop complete, returning result 11661 1726882401.35081: _execute() done 11661 1726882401.35083: dumping result to json 11661 1726882401.35086: done dumping result, returning 11661 1726882401.35095: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-896b-2321-00000000008f] 11661 1726882401.35104: sending task result for task 0e448fcc-3ce9-896b-2321-00000000008f 11661 1726882401.35195: done sending task result for task 0e448fcc-3ce9-896b-2321-00000000008f 11661 1726882401.35197: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.1", "persistent_state": "absent", "state": "down" }, { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 11661 1726882401.35324: no more pending results, returning what we have 11661 1726882401.35327: results queue empty 11661 1726882401.35328: checking for any_errors_fatal 11661 1726882401.35337: done checking for any_errors_fatal 11661 1726882401.35338: checking for max_fail_percentage 11661 1726882401.35339: done checking for max_fail_percentage 11661 1726882401.35340: checking to see if all hosts have failed and the running result is not ok 11661 1726882401.35341: done checking to see if all hosts have failed 11661 1726882401.35342: getting the remaining hosts for this loop 11661 1726882401.35343: done getting the remaining hosts for this loop 11661 1726882401.35347: 
getting the next task for host managed_node2 11661 1726882401.35355: done getting next task for host managed_node2 11661 1726882401.35359: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 11661 1726882401.35362: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11661 1726882401.35376: getting variables 11661 1726882401.35378: in VariableManager get_vars() 11661 1726882401.35419: Calling all_inventory to load vars for managed_node2 11661 1726882401.35422: Calling groups_inventory to load vars for managed_node2 11661 1726882401.35425: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882401.35435: Calling all_plugins_play to load vars for managed_node2 11661 1726882401.35445: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882401.35449: Calling groups_plugins_play to load vars for managed_node2 11661 1726882401.36845: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882401.38546: done with get_vars() 11661 1726882401.38579: done getting variables 11661 1726882401.38640: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:33:21 -0400 (0:00:00.056) 0:00:30.100 ****** 11661 1726882401.38679: entering _queue_task() for managed_node2/debug 11661 1726882401.38994: worker is 1 (out of 1 available) 11661 1726882401.39008: exiting _queue_task() for managed_node2/debug 11661 1726882401.39019: done queuing things up, now waiting for results queue to drain 11661 1726882401.39021: waiting for pending results... 
11661 1726882401.39311: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 11661 1726882401.39476: in run() - task 0e448fcc-3ce9-896b-2321-000000000090 11661 1726882401.39498: variable 'ansible_search_path' from source: unknown 11661 1726882401.39505: variable 'ansible_search_path' from source: unknown 11661 1726882401.39542: calling self._execute() 11661 1726882401.39644: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882401.39655: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882401.39671: variable 'omit' from source: magic vars 11661 1726882401.40056: variable 'ansible_distribution_major_version' from source: facts 11661 1726882401.40074: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882401.40183: variable 'network_state' from source: role '' defaults 11661 1726882401.40197: Evaluated conditional (network_state != {}): False 11661 1726882401.40204: when evaluation is False, skipping this task 11661 1726882401.40209: _execute() done 11661 1726882401.40214: dumping result to json 11661 1726882401.40224: done dumping result, returning 11661 1726882401.40234: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-896b-2321-000000000090] 11661 1726882401.40245: sending task result for task 0e448fcc-3ce9-896b-2321-000000000090 skipping: [managed_node2] => { "false_condition": "network_state != {}" } 11661 1726882401.40420: no more pending results, returning what we have 11661 1726882401.40424: results queue empty 11661 1726882401.40425: checking for any_errors_fatal 11661 1726882401.40434: done checking for any_errors_fatal 11661 1726882401.40434: checking for max_fail_percentage 11661 1726882401.40437: done checking for max_fail_percentage 11661 1726882401.40438: checking to see if all hosts have 
failed and the running result is not ok 11661 1726882401.40439: done checking to see if all hosts have failed 11661 1726882401.40440: getting the remaining hosts for this loop 11661 1726882401.40441: done getting the remaining hosts for this loop 11661 1726882401.40446: getting the next task for host managed_node2 11661 1726882401.40454: done getting next task for host managed_node2 11661 1726882401.40458: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 11661 1726882401.40465: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11661 1726882401.40488: getting variables 11661 1726882401.40490: in VariableManager get_vars() 11661 1726882401.40535: Calling all_inventory to load vars for managed_node2 11661 1726882401.40538: Calling groups_inventory to load vars for managed_node2 11661 1726882401.40541: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882401.40554: Calling all_plugins_play to load vars for managed_node2 11661 1726882401.40557: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882401.40561: Calling groups_plugins_play to load vars for managed_node2 11661 1726882401.41827: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000090 11661 1726882401.41831: WORKER PROCESS EXITING 11661 1726882401.42451: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882401.44118: done with get_vars() 11661 1726882401.44149: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:33:21 -0400 (0:00:00.055) 0:00:30.155 ****** 11661 1726882401.44250: entering _queue_task() for managed_node2/ping 11661 1726882401.44566: worker is 1 (out of 1 available) 11661 1726882401.44580: exiting _queue_task() for managed_node2/ping 11661 1726882401.44591: done queuing things up, now waiting for results queue to drain 11661 1726882401.44593: waiting for pending results... 
11661 1726882401.44882: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 11661 1726882401.45019: in run() - task 0e448fcc-3ce9-896b-2321-000000000091 11661 1726882401.45042: variable 'ansible_search_path' from source: unknown 11661 1726882401.45049: variable 'ansible_search_path' from source: unknown 11661 1726882401.45091: calling self._execute() 11661 1726882401.45197: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882401.45208: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882401.45224: variable 'omit' from source: magic vars 11661 1726882401.45595: variable 'ansible_distribution_major_version' from source: facts 11661 1726882401.45611: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882401.45621: variable 'omit' from source: magic vars 11661 1726882401.45691: variable 'omit' from source: magic vars 11661 1726882401.45728: variable 'omit' from source: magic vars 11661 1726882401.45774: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11661 1726882401.45820: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11661 1726882401.45843: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11661 1726882401.45866: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882401.45883: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882401.45919: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11661 1726882401.45926: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882401.45933: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 11661 1726882401.46035: Set connection var ansible_connection to ssh 11661 1726882401.46045: Set connection var ansible_pipelining to False 11661 1726882401.46054: Set connection var ansible_module_compression to ZIP_DEFLATED 11661 1726882401.46070: Set connection var ansible_timeout to 10 11661 1726882401.46078: Set connection var ansible_shell_type to sh 11661 1726882401.46090: Set connection var ansible_shell_executable to /bin/sh 11661 1726882401.46113: variable 'ansible_shell_executable' from source: unknown 11661 1726882401.46126: variable 'ansible_connection' from source: unknown 11661 1726882401.46134: variable 'ansible_module_compression' from source: unknown 11661 1726882401.46140: variable 'ansible_shell_type' from source: unknown 11661 1726882401.46145: variable 'ansible_shell_executable' from source: unknown 11661 1726882401.46151: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882401.46158: variable 'ansible_pipelining' from source: unknown 11661 1726882401.46166: variable 'ansible_timeout' from source: unknown 11661 1726882401.46174: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882401.46379: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11661 1726882401.46396: variable 'omit' from source: magic vars 11661 1726882401.46405: starting attempt loop 11661 1726882401.46412: running the handler 11661 1726882401.46429: _low_level_execute_command(): starting 11661 1726882401.46443: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11661 1726882401.47201: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882401.47221: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 
1726882401.47236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882401.47255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882401.47301: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882401.47314: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882401.47333: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882401.47352: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882401.47368: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882401.47382: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882401.47395: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882401.47409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882401.47425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882401.47441: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882401.47453: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882401.47470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882401.47548: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882401.47577: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882401.47594: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882401.47735: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 
1726882401.49415: stdout chunk (state=3): >>>/root <<< 11661 1726882401.49581: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882401.49605: stderr chunk (state=3): >>><<< 11661 1726882401.49628: stdout chunk (state=3): >>><<< 11661 1726882401.49747: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882401.49751: _low_level_execute_command(): starting 11661 1726882401.49755: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882401.496493-13020-58306955953862 `" && echo ansible-tmp-1726882401.496493-13020-58306955953862="` echo /root/.ansible/tmp/ansible-tmp-1726882401.496493-13020-58306955953862 `" ) && sleep 0' 11661 1726882401.50352: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 11661 1726882401.50368: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882401.50383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882401.50403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882401.50449: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882401.50461: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882401.50477: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882401.50493: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882401.50504: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882401.50514: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882401.50526: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882401.50540: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882401.50554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882401.50567: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882401.50578: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882401.50589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882401.50668: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882401.50693: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882401.50709: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 11661 1726882401.50854: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882401.52735: stdout chunk (state=3): >>>ansible-tmp-1726882401.496493-13020-58306955953862=/root/.ansible/tmp/ansible-tmp-1726882401.496493-13020-58306955953862 <<< 11661 1726882401.52943: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882401.52946: stdout chunk (state=3): >>><<< 11661 1726882401.52949: stderr chunk (state=3): >>><<< 11661 1726882401.53073: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882401.496493-13020-58306955953862=/root/.ansible/tmp/ansible-tmp-1726882401.496493-13020-58306955953862 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882401.53078: variable 'ansible_module_compression' from source: unknown 11661 1726882401.53080: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-11661k_nrvl51/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 11661 1726882401.53181: variable 'ansible_facts' from source: unknown 11661 1726882401.53198: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882401.496493-13020-58306955953862/AnsiballZ_ping.py 11661 1726882401.53351: Sending initial data 11661 1726882401.53355: Sent initial data (151 bytes) 11661 1726882401.54345: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882401.54360: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882401.54378: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882401.54395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882401.54436: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882401.54447: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882401.54460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882401.54482: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882401.54493: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882401.54503: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882401.54514: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882401.54526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882401.54540: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882401.54550: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 
1726882401.54561: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882401.54575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882401.54654: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882401.54679: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882401.54696: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882401.54829: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882401.56630: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11661 1726882401.56722: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 11661 1726882401.56820: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11661k_nrvl51/tmp4f6s5qyc /root/.ansible/tmp/ansible-tmp-1726882401.496493-13020-58306955953862/AnsiballZ_ping.py <<< 11661 1726882401.56911: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11661 1726882401.58590: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882401.58822: stderr chunk (state=3): >>><<< 11661 1726882401.58826: stdout chunk (state=3): >>><<< 11661 1726882401.58828: done transferring module 
to remote 11661 1726882401.58835: _low_level_execute_command(): starting 11661 1726882401.58838: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882401.496493-13020-58306955953862/ /root/.ansible/tmp/ansible-tmp-1726882401.496493-13020-58306955953862/AnsiballZ_ping.py && sleep 0' 11661 1726882401.59534: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882401.59538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882401.59568: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 11661 1726882401.59573: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882401.59586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882401.59644: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882401.59879: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882401.60213: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882401.61989: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882401.62066: stderr chunk 
(state=3): >>><<< 11661 1726882401.62070: stdout chunk (state=3): >>><<< 11661 1726882401.62174: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882401.62177: _low_level_execute_command(): starting 11661 1726882401.62180: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882401.496493-13020-58306955953862/AnsiballZ_ping.py && sleep 0' 11661 1726882401.63614: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882401.63618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882401.63654: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 
10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882401.63657: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882401.63659: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882401.64442: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882401.64446: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882401.64559: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882401.77576: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 11661 1726882401.78528: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 11661 1726882401.78607: stderr chunk (state=3): >>><<< 11661 1726882401.78611: stdout chunk (state=3): >>><<< 11661 1726882401.78744: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
11661 1726882401.78748: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882401.496493-13020-58306955953862/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11661 1726882401.78753: _low_level_execute_command(): starting 11661 1726882401.78756: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882401.496493-13020-58306955953862/ > /dev/null 2>&1 && sleep 0' 11661 1726882401.80326: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882401.80340: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882401.80358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882401.80381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882401.80428: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882401.80490: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882401.80504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882401.80521: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882401.80533: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 
1726882401.80543: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882401.80558: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882401.80574: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882401.80594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882401.80606: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882401.80617: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882401.80630: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882401.80824: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882401.80848: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882401.80869: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882401.81003: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882401.82921: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882401.82925: stdout chunk (state=3): >>><<< 11661 1726882401.82927: stderr chunk (state=3): >>><<< 11661 1726882401.83173: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882401.83177: handler run complete 11661 1726882401.83180: attempt loop complete, returning result 11661 1726882401.83182: _execute() done 11661 1726882401.83185: dumping result to json 11661 1726882401.83187: done dumping result, returning 11661 1726882401.83189: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-896b-2321-000000000091] 11661 1726882401.83191: sending task result for task 0e448fcc-3ce9-896b-2321-000000000091 11661 1726882401.83262: done sending task result for task 0e448fcc-3ce9-896b-2321-000000000091 11661 1726882401.83269: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 11661 1726882401.83338: no more pending results, returning what we have 11661 1726882401.83342: results queue empty 11661 1726882401.83343: checking for any_errors_fatal 11661 1726882401.83354: done checking for any_errors_fatal 11661 1726882401.83355: checking for max_fail_percentage 11661 1726882401.83357: done checking for max_fail_percentage 11661 1726882401.83358: checking to see if all hosts have failed and the running result is not ok 11661 1726882401.83359: done checking to see if all hosts have failed 11661 1726882401.83360: getting the remaining hosts for this loop 11661 1726882401.83362: done getting the remaining hosts for this loop 11661 
1726882401.83370: getting the next task for host managed_node2 11661 1726882401.83381: done getting next task for host managed_node2 11661 1726882401.83384: ^ task is: TASK: meta (role_complete) 11661 1726882401.83388: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11661 1726882401.83400: getting variables 11661 1726882401.83403: in VariableManager get_vars() 11661 1726882401.83447: Calling all_inventory to load vars for managed_node2 11661 1726882401.83450: Calling groups_inventory to load vars for managed_node2 11661 1726882401.83455: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882401.83469: Calling all_plugins_play to load vars for managed_node2 11661 1726882401.83472: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882401.83476: Calling groups_plugins_play to load vars for managed_node2 11661 1726882401.86401: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882401.91291: done with get_vars() 11661 1726882401.91321: done getting variables 11661 1726882401.91644: done queuing things up, now waiting for results queue to drain 11661 1726882401.91647: results queue empty 11661 1726882401.91648: checking for any_errors_fatal 11661 1726882401.91650: done checking for any_errors_fatal 11661 1726882401.91651: checking for max_fail_percentage 11661 1726882401.91652: done checking for max_fail_percentage 11661 1726882401.91653: checking to see if all hosts have failed and the running result is not ok 11661 1726882401.91654: done checking to see if all hosts have failed 11661 1726882401.91654: getting the remaining hosts for this loop 11661 1726882401.91655: done getting the remaining hosts for this loop 11661 1726882401.91660: getting the next task for host managed_node2 11661 1726882401.91969: done getting next task for host managed_node2 11661 1726882401.91972: ^ task is: TASK: Delete the device '{{ controller_device }}' 11661 1726882401.91975: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11661 1726882401.91978: getting variables 11661 1726882401.91979: in VariableManager get_vars() 11661 1726882401.91996: Calling all_inventory to load vars for managed_node2 11661 1726882401.92003: Calling groups_inventory to load vars for managed_node2 11661 1726882401.92005: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882401.92011: Calling all_plugins_play to load vars for managed_node2 11661 1726882401.92013: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882401.92016: Calling groups_plugins_play to load vars for managed_node2 11661 1726882401.94579: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882401.98323: done with get_vars() 11661 1726882401.98356: done getting variables 11661 1726882401.98403: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11661 1726882401.98528: variable 'controller_device' from source: play vars TASK [Delete the device 'nm-bond'] ********************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:114 Friday 20 September 2024 21:33:21 -0400 (0:00:00.543) 0:00:30.699 ****** 11661 1726882401.98559: entering _queue_task() for managed_node2/command 11661 1726882401.99186: worker is 1 (out of 1 available) 11661 1726882401.99200: exiting 
_queue_task() for managed_node2/command 11661 1726882401.99212: done queuing things up, now waiting for results queue to drain 11661 1726882401.99214: waiting for pending results... 11661 1726882402.00257: running TaskExecutor() for managed_node2/TASK: Delete the device 'nm-bond' 11661 1726882402.00382: in run() - task 0e448fcc-3ce9-896b-2321-0000000000c1 11661 1726882402.00396: variable 'ansible_search_path' from source: unknown 11661 1726882402.00440: calling self._execute() 11661 1726882402.00548: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882402.01178: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882402.01196: variable 'omit' from source: magic vars 11661 1726882402.01580: variable 'ansible_distribution_major_version' from source: facts 11661 1726882402.01601: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882402.01613: variable 'omit' from source: magic vars 11661 1726882402.01638: variable 'omit' from source: magic vars 11661 1726882402.01739: variable 'controller_device' from source: play vars 11661 1726882402.01767: variable 'omit' from source: magic vars 11661 1726882402.01816: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11661 1726882402.01858: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11661 1726882402.02314: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11661 1726882402.02336: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882402.02356: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882402.02394: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11661 1726882402.02405: 
variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882402.02413: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882402.02517: Set connection var ansible_connection to ssh 11661 1726882402.02530: Set connection var ansible_pipelining to False 11661 1726882402.02541: Set connection var ansible_module_compression to ZIP_DEFLATED 11661 1726882402.02556: Set connection var ansible_timeout to 10 11661 1726882402.02567: Set connection var ansible_shell_type to sh 11661 1726882402.02581: Set connection var ansible_shell_executable to /bin/sh 11661 1726882402.02609: variable 'ansible_shell_executable' from source: unknown 11661 1726882402.02616: variable 'ansible_connection' from source: unknown 11661 1726882402.02623: variable 'ansible_module_compression' from source: unknown 11661 1726882402.02629: variable 'ansible_shell_type' from source: unknown 11661 1726882402.02635: variable 'ansible_shell_executable' from source: unknown 11661 1726882402.02641: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882402.02647: variable 'ansible_pipelining' from source: unknown 11661 1726882402.02656: variable 'ansible_timeout' from source: unknown 11661 1726882402.02668: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882402.02812: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11661 1726882402.02832: variable 'omit' from source: magic vars 11661 1726882402.02841: starting attempt loop 11661 1726882402.02847: running the handler 11661 1726882402.02872: _low_level_execute_command(): starting 11661 1726882402.02884: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11661 
1726882402.03688: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882402.03705: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882402.03720: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882402.03740: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882402.03792: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882402.03807: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882402.03820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882402.03838: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882402.03849: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882402.03866: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882402.03881: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882402.03895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882402.03916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882402.03928: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882402.03939: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882402.03955: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882402.04037: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882402.04067: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882402.04086: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882402.04223: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882402.05889: stdout chunk (state=3): >>>/root <<< 11661 1726882402.06069: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882402.06097: stdout chunk (state=3): >>><<< 11661 1726882402.06101: stderr chunk (state=3): >>><<< 11661 1726882402.06112: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882402.06127: _low_level_execute_command(): starting 11661 1726882402.06135: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882402.0611324-13049-260695454710701 `" && echo 
ansible-tmp-1726882402.0611324-13049-260695454710701="` echo /root/.ansible/tmp/ansible-tmp-1726882402.0611324-13049-260695454710701 `" ) && sleep 0' 11661 1726882402.07712: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882402.07721: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882402.07732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882402.07747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882402.07795: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882402.07802: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882402.07812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882402.07826: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882402.07979: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882402.07985: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882402.07993: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882402.08003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882402.08014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882402.08022: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882402.08028: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882402.08038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882402.08116: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master <<< 11661 1726882402.08135: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882402.08148: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882402.08286: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882402.10189: stdout chunk (state=3): >>>ansible-tmp-1726882402.0611324-13049-260695454710701=/root/.ansible/tmp/ansible-tmp-1726882402.0611324-13049-260695454710701 <<< 11661 1726882402.10381: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882402.10397: stderr chunk (state=3): >>><<< 11661 1726882402.10400: stdout chunk (state=3): >>><<< 11661 1726882402.10570: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882402.0611324-13049-260695454710701=/root/.ansible/tmp/ansible-tmp-1726882402.0611324-13049-260695454710701 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882402.10575: variable 'ansible_module_compression' from source: unknown 11661 1726882402.10577: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11661k_nrvl51/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11661 1726882402.10579: variable 'ansible_facts' from source: unknown 11661 1726882402.10647: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882402.0611324-13049-260695454710701/AnsiballZ_command.py 11661 1726882402.11249: Sending initial data 11661 1726882402.11255: Sent initial data (156 bytes) 11661 1726882402.13648: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882402.13786: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882402.13802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882402.13820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882402.13869: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882402.13882: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882402.13895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882402.13913: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882402.13925: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882402.13935: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882402.13947: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882402.13966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882402.13984: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882402.13998: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882402.14013: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882402.14021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882402.14096: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882402.14188: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882402.14200: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882402.14330: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882402.16131: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11661 1726882402.16226: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 11661 1726882402.16329: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11661k_nrvl51/tmp9l3s7jw7 /root/.ansible/tmp/ansible-tmp-1726882402.0611324-13049-260695454710701/AnsiballZ_command.py <<< 11661 1726882402.16426: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11661 
1726882402.17880: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882402.17970: stderr chunk (state=3): >>><<< 11661 1726882402.17974: stdout chunk (state=3): >>><<< 11661 1726882402.18072: done transferring module to remote 11661 1726882402.18079: _low_level_execute_command(): starting 11661 1726882402.18081: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882402.0611324-13049-260695454710701/ /root/.ansible/tmp/ansible-tmp-1726882402.0611324-13049-260695454710701/AnsiballZ_command.py && sleep 0' 11661 1726882402.19691: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882402.19696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882402.19912: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882402.19917: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882402.19920: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882402.19923: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 11661 1726882402.19925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882402.19986: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 
1726882402.20010: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882402.20026: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882402.20174: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882402.21987: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882402.22109: stderr chunk (state=3): >>><<< 11661 1726882402.22112: stdout chunk (state=3): >>><<< 11661 1726882402.22141: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882402.22144: _low_level_execute_command(): starting 11661 1726882402.22149: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882402.0611324-13049-260695454710701/AnsiballZ_command.py && sleep 0' 11661 
1726882402.22924: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882402.22985: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882402.23082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882402.23097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882402.23135: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882402.23143: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882402.23155: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882402.23168: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882402.23177: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882402.23187: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882402.23195: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882402.23204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882402.23216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882402.23223: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882402.23230: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882402.23239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882402.23410: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882402.23429: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882402.23442: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882402.23580: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882402.37298: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Cannot find device \"nm-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "nm-bond"], "start": "2024-09-20 21:33:22.363308", "end": "2024-09-20 21:33:22.370446", "delta": "0:00:00.007138", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11661 1726882402.38500: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.11.158 closed. <<< 11661 1726882402.38505: stdout chunk (state=3): >>><<< 11661 1726882402.38507: stderr chunk (state=3): >>><<< 11661 1726882402.38665: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"nm-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "nm-bond"], "start": "2024-09-20 21:33:22.363308", "end": "2024-09-20 21:33:22.370446", "delta": "0:00:00.007138", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.11.158 closed. 11661 1726882402.38676: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882402.0611324-13049-260695454710701/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11661 1726882402.38679: _low_level_execute_command(): starting 11661 1726882402.38682: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882402.0611324-13049-260695454710701/ > /dev/null 2>&1 && sleep 0' 11661 1726882402.39315: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 
2024 <<< 11661 1726882402.39329: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882402.39350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882402.39373: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882402.39417: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882402.39430: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882402.39450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882402.39474: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882402.39487: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882402.39499: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882402.39512: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882402.39526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882402.39542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882402.39563: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882402.39578: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882402.39593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882402.39678: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882402.39702: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882402.39719: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 
11661 1726882402.39855: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882402.41687: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882402.41790: stderr chunk (state=3): >>><<< 11661 1726882402.41802: stdout chunk (state=3): >>><<< 11661 1726882402.41974: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882402.41978: handler run complete 11661 1726882402.41980: Evaluated conditional (False): False 11661 1726882402.41982: Evaluated conditional (False): False 11661 1726882402.41984: attempt loop complete, returning result 11661 1726882402.41987: _execute() done 11661 1726882402.41989: dumping result to json 11661 1726882402.41991: done dumping result, returning 11661 1726882402.41993: done running TaskExecutor() for managed_node2/TASK: Delete the 
device 'nm-bond' [0e448fcc-3ce9-896b-2321-0000000000c1] 11661 1726882402.41995: sending task result for task 0e448fcc-3ce9-896b-2321-0000000000c1 11661 1726882402.42075: done sending task result for task 0e448fcc-3ce9-896b-2321-0000000000c1 11661 1726882402.42079: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ip", "link", "del", "nm-bond" ], "delta": "0:00:00.007138", "end": "2024-09-20 21:33:22.370446", "failed_when_result": false, "rc": 1, "start": "2024-09-20 21:33:22.363308" } STDERR: Cannot find device "nm-bond" MSG: non-zero return code 11661 1726882402.42155: no more pending results, returning what we have 11661 1726882402.42158: results queue empty 11661 1726882402.42159: checking for any_errors_fatal 11661 1726882402.42162: done checking for any_errors_fatal 11661 1726882402.42162: checking for max_fail_percentage 11661 1726882402.42166: done checking for max_fail_percentage 11661 1726882402.42168: checking to see if all hosts have failed and the running result is not ok 11661 1726882402.42169: done checking to see if all hosts have failed 11661 1726882402.42169: getting the remaining hosts for this loop 11661 1726882402.42171: done getting the remaining hosts for this loop 11661 1726882402.42175: getting the next task for host managed_node2 11661 1726882402.42184: done getting next task for host managed_node2 11661 1726882402.42188: ^ task is: TASK: Remove test interfaces 11661 1726882402.42191: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11661 1726882402.42197: getting variables 11661 1726882402.42199: in VariableManager get_vars() 11661 1726882402.42239: Calling all_inventory to load vars for managed_node2 11661 1726882402.42242: Calling groups_inventory to load vars for managed_node2 11661 1726882402.42245: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882402.42261: Calling all_plugins_play to load vars for managed_node2 11661 1726882402.42267: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882402.42271: Calling groups_plugins_play to load vars for managed_node2 11661 1726882402.44275: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882402.46071: done with get_vars() 11661 1726882402.46097: done getting variables 11661 1726882402.46158: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Remove test interfaces] ************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:3 Friday 20 September 2024 21:33:22 -0400 (0:00:00.476) 0:00:31.175 ****** 11661 1726882402.46198: entering _queue_task() for managed_node2/shell 11661 1726882402.46538: worker is 1 (out of 1 
available) 11661 1726882402.46555: exiting _queue_task() for managed_node2/shell 11661 1726882402.46569: done queuing things up, now waiting for results queue to drain 11661 1726882402.46571: waiting for pending results... 11661 1726882402.46875: running TaskExecutor() for managed_node2/TASK: Remove test interfaces 11661 1726882402.47020: in run() - task 0e448fcc-3ce9-896b-2321-0000000000c5 11661 1726882402.47039: variable 'ansible_search_path' from source: unknown 11661 1726882402.47053: variable 'ansible_search_path' from source: unknown 11661 1726882402.47096: calling self._execute() 11661 1726882402.47200: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882402.47212: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882402.47233: variable 'omit' from source: magic vars 11661 1726882402.47649: variable 'ansible_distribution_major_version' from source: facts 11661 1726882402.47677: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882402.47689: variable 'omit' from source: magic vars 11661 1726882402.47765: variable 'omit' from source: magic vars 11661 1726882402.47956: variable 'dhcp_interface1' from source: play vars 11661 1726882402.47970: variable 'dhcp_interface2' from source: play vars 11661 1726882402.48001: variable 'omit' from source: magic vars 11661 1726882402.48057: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11661 1726882402.48104: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11661 1726882402.48129: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11661 1726882402.48160: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882402.48179: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882402.48218: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11661 1726882402.48227: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882402.48236: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882402.48358: Set connection var ansible_connection to ssh 11661 1726882402.48376: Set connection var ansible_pipelining to False 11661 1726882402.48388: Set connection var ansible_module_compression to ZIP_DEFLATED 11661 1726882402.48401: Set connection var ansible_timeout to 10 11661 1726882402.48408: Set connection var ansible_shell_type to sh 11661 1726882402.48426: Set connection var ansible_shell_executable to /bin/sh 11661 1726882402.48458: variable 'ansible_shell_executable' from source: unknown 11661 1726882402.48472: variable 'ansible_connection' from source: unknown 11661 1726882402.48479: variable 'ansible_module_compression' from source: unknown 11661 1726882402.48486: variable 'ansible_shell_type' from source: unknown 11661 1726882402.48493: variable 'ansible_shell_executable' from source: unknown 11661 1726882402.48499: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882402.48509: variable 'ansible_pipelining' from source: unknown 11661 1726882402.48516: variable 'ansible_timeout' from source: unknown 11661 1726882402.48527: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882402.48708: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11661 1726882402.48726: variable 'omit' from source: magic vars 11661 1726882402.48735: starting attempt 
loop 11661 1726882402.48742: running the handler 11661 1726882402.48778: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11661 1726882402.48810: _low_level_execute_command(): starting 11661 1726882402.48822: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11661 1726882402.49681: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882402.49697: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882402.49712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882402.49740: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882402.49790: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882402.49803: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882402.49817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882402.49839: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882402.49858: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882402.49874: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882402.49888: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882402.49902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882402.49917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 
1726882402.49929: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882402.49944: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882402.49966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882402.50042: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882402.50078: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882402.50095: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882402.50227: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882402.51880: stdout chunk (state=3): >>>/root <<< 11661 1726882402.52055: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882402.52059: stderr chunk (state=3): >>><<< 11661 1726882402.52062: stdout chunk (state=3): >>><<< 11661 1726882402.52096: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882402.52108: _low_level_execute_command(): starting 11661 1726882402.52116: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882402.52095-13065-271769707748012 `" && echo ansible-tmp-1726882402.52095-13065-271769707748012="` echo /root/.ansible/tmp/ansible-tmp-1726882402.52095-13065-271769707748012 `" ) && sleep 0' 11661 1726882402.52828: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882402.53382: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882402.53392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882402.53406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882402.53446: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882402.53456: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882402.53466: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882402.53481: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882402.53488: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882402.53496: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882402.53502: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882402.53512: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config <<< 11661 1726882402.53523: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882402.53530: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882402.53536: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882402.53545: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882402.53620: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882402.53640: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882402.53656: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882402.53890: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882402.55700: stdout chunk (state=3): >>>ansible-tmp-1726882402.52095-13065-271769707748012=/root/.ansible/tmp/ansible-tmp-1726882402.52095-13065-271769707748012 <<< 11661 1726882402.55883: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882402.55901: stderr chunk (state=3): >>><<< 11661 1726882402.55905: stdout chunk (state=3): >>><<< 11661 1726882402.55926: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882402.52095-13065-271769707748012=/root/.ansible/tmp/ansible-tmp-1726882402.52095-13065-271769707748012 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882402.55960: variable 'ansible_module_compression' from source: unknown 11661 1726882402.56018: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11661k_nrvl51/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11661 1726882402.56057: variable 'ansible_facts' from source: unknown 11661 1726882402.56136: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882402.52095-13065-271769707748012/AnsiballZ_command.py 11661 1726882402.56690: Sending initial data 11661 1726882402.56693: Sent initial data (154 bytes) 11661 1726882402.58496: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882402.58506: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882402.58517: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882402.58532: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882402.58599: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882402.58606: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882402.58617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882402.58631: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882402.58639: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882402.58645: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882402.58656: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882402.58663: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882402.58678: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882402.58685: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882402.58693: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882402.58706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882402.58778: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882402.58794: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882402.58805: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882402.58938: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882402.60710: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 
1 <<< 11661 1726882402.60801: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 11661 1726882402.60900: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11661k_nrvl51/tmpww441hs8 /root/.ansible/tmp/ansible-tmp-1726882402.52095-13065-271769707748012/AnsiballZ_command.py <<< 11661 1726882402.60996: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11661 1726882402.62431: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882402.62567: stderr chunk (state=3): >>><<< 11661 1726882402.62571: stdout chunk (state=3): >>><<< 11661 1726882402.62573: done transferring module to remote 11661 1726882402.62575: _low_level_execute_command(): starting 11661 1726882402.62577: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882402.52095-13065-271769707748012/ /root/.ansible/tmp/ansible-tmp-1726882402.52095-13065-271769707748012/AnsiballZ_command.py && sleep 0' 11661 1726882402.63158: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882402.63174: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882402.63188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882402.63203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882402.63242: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882402.63255: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882402.63272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882402.63288: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
<<< 11661 1726882402.63298: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882402.63307: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882402.63317: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882402.63328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882402.63341: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882402.63354: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882402.63370: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882402.63385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882402.63478: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882402.63495: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882402.63541: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882402.63791: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882402.65585: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882402.65654: stderr chunk (state=3): >>><<< 11661 1726882402.65658: stdout chunk (state=3): >>><<< 11661 1726882402.65756: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882402.65761: _low_level_execute_command(): starting 11661 1726882402.65765: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882402.52095-13065-271769707748012/AnsiballZ_command.py && sleep 0' 11661 1726882402.66819: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882402.66984: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882402.66999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882402.67017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882402.67064: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882402.67077: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882402.67090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882402.67107: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882402.67118: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is 
address <<< 11661 1726882402.67128: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882402.67138: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882402.67150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882402.67171: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882402.67182: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882402.67191: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882402.67204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882402.67280: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882402.67382: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882402.67398: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882402.67720: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882402.86840: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-20 21:33:22.807783", "end": "2024-09-20 21:33:22.865649", "delta": "0:00:00.057866", "msg": "", 
"invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11661 1726882402.88301: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 11661 1726882402.88348: stderr chunk (state=3): >>><<< 11661 1726882402.88354: stdout chunk (state=3): >>><<< 11661 1726882402.88499: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-20 21:33:22.807783", "end": "2024-09-20 21:33:22.865649", "delta": "0:00:00.057866", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 
]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
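The `cmd` captured in the result above relies on a small shell idiom: under `set -e`, each `ip link delete` is followed by `|| rc="$?"` so a failure is recorded instead of aborting the script, and `exec 1>&2` folds stdout into stderr so the `-x` trace and any error messages arrive as one stream. A minimal runnable sketch of the same pattern, with `false` standing in for an `ip link delete` that fails (so it runs without iproute2 or root; `-x` is dropped to keep the output readable):

```shell
set -euo pipefail
exec 1>&2            # send all output to stderr, as the logged script does
rc=0
false || rc="$?"     # stand-in for: ip link delete test1 || rc="$?"
if [ "$rc" != 0 ]; then
  echo "ERROR - could not delete link test1 - error $rc"
fi
echo "script reached the end despite rc=$rc"
```

Because the failing command sits on the left of `||`, `set -e` does not terminate the script, and the task can report every link that could not be deleted rather than stopping at the first one.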
11661 1726882402.88503: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test1 - error "$rc"\nfi\nip link delete test2 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test2 - error "$rc"\nfi\nip link delete testbr || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link testbr - error "$rc"\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882402.52095-13065-271769707748012/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11661 1726882402.88506: _low_level_execute_command(): starting 11661 1726882402.88508: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882402.52095-13065-271769707748012/ > /dev/null 2>&1 && sleep 0' 11661 1726882402.89184: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882402.89188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882402.89215: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 11661 1726882402.89218: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 11661 1726882402.89220: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882402.89222: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882402.89274: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882402.89291: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882402.89385: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882402.91283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882402.91305: stderr chunk (state=3): >>><<< 11661 1726882402.91320: stdout chunk (state=3): >>><<< 11661 1726882402.91370: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882402.91376: handler run complete 11661 1726882402.91573: Evaluated conditional (False): False 11661 1726882402.91576: attempt loop complete, returning result 11661 1726882402.91578: _execute() done 11661 1726882402.91580: dumping result to json 11661 1726882402.91582: done dumping result, returning 11661 1726882402.91584: done running TaskExecutor() for managed_node2/TASK: Remove test interfaces [0e448fcc-3ce9-896b-2321-0000000000c5] 11661 1726882402.91586: sending task result for task 0e448fcc-3ce9-896b-2321-0000000000c5 11661 1726882402.91659: done sending task result for task 0e448fcc-3ce9-896b-2321-0000000000c5 11661 1726882402.91662: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "delta": "0:00:00.057866", "end": "2024-09-20 21:33:22.865649", "rc": 0, "start": "2024-09-20 21:33:22.807783" } STDERR: + exec + rc=0 + ip link delete test1 + '[' 0 '!=' 0 ']' + ip link delete test2 + '[' 0 '!=' 0 ']' + ip link delete testbr + '[' 0 '!=' 0 ']' 11661 1726882402.91750: no more pending results, returning what we have 11661 1726882402.91756: results queue empty 
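The "Remove test interfaces" task above runs each `ip link delete` with a capture-and-report pattern rather than aborting on the first failure. A minimal standalone sketch of that pattern follows; `false` stands in for `ip link delete test1` (an assumption, so the snippet runs without any network namespaces or root privileges):

```shell
# Sketch of the per-command error capture used by the task: run the
# cleanup command, record its exit status in rc, and report instead
# of exiting. 'false' is a stand-in for 'ip link delete test1'.
set -uo pipefail
rc=0
false || rc="$?"          # command fails; its status (1) lands in rc
if [ "$rc" != 0 ]; then
  echo "ERROR - could not delete link test1 - error $rc"
fi
```

Because the failure status is captured with `|| rc="$?"`, a missing interface produces a diagnostic line but the script still exits 0 — which is why the task reports `ok` rather than `failed` even when a link is already gone.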
11661 1726882402.91757: checking for any_errors_fatal 11661 1726882402.91775: done checking for any_errors_fatal 11661 1726882402.91776: checking for max_fail_percentage 11661 1726882402.91779: done checking for max_fail_percentage 11661 1726882402.91780: checking to see if all hosts have failed and the running result is not ok 11661 1726882402.91781: done checking to see if all hosts have failed 11661 1726882402.91781: getting the remaining hosts for this loop 11661 1726882402.91784: done getting the remaining hosts for this loop 11661 1726882402.91788: getting the next task for host managed_node2 11661 1726882402.91797: done getting next task for host managed_node2 11661 1726882402.91800: ^ task is: TASK: Stop dnsmasq/radvd services 11661 1726882402.91804: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11661 1726882402.91809: getting variables 11661 1726882402.91811: in VariableManager get_vars() 11661 1726882402.91856: Calling all_inventory to load vars for managed_node2 11661 1726882402.91859: Calling groups_inventory to load vars for managed_node2 11661 1726882402.91862: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882402.91880: Calling all_plugins_play to load vars for managed_node2 11661 1726882402.91884: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882402.91888: Calling groups_plugins_play to load vars for managed_node2 11661 1726882402.93047: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882402.93994: done with get_vars() 11661 1726882402.94012: done getting variables 11661 1726882402.94059: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Stop dnsmasq/radvd services] ********************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:23 Friday 20 September 2024 21:33:22 -0400 (0:00:00.478) 0:00:31.654 ****** 11661 1726882402.94084: entering _queue_task() for managed_node2/shell 11661 1726882402.94312: worker is 1 (out of 1 available) 11661 1726882402.94327: exiting _queue_task() for managed_node2/shell 11661 1726882402.94339: done queuing things up, now waiting for results queue to drain 11661 1726882402.94341: waiting for pending results... 
11661 1726882402.94525: running TaskExecutor() for managed_node2/TASK: Stop dnsmasq/radvd services 11661 1726882402.94615: in run() - task 0e448fcc-3ce9-896b-2321-0000000000c6 11661 1726882402.94626: variable 'ansible_search_path' from source: unknown 11661 1726882402.94629: variable 'ansible_search_path' from source: unknown 11661 1726882402.94661: calling self._execute() 11661 1726882402.94740: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882402.94744: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882402.94755: variable 'omit' from source: magic vars 11661 1726882402.95043: variable 'ansible_distribution_major_version' from source: facts 11661 1726882402.95052: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882402.95062: variable 'omit' from source: magic vars 11661 1726882402.95099: variable 'omit' from source: magic vars 11661 1726882402.95129: variable 'omit' from source: magic vars 11661 1726882402.95162: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11661 1726882402.95190: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11661 1726882402.95207: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11661 1726882402.95222: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882402.95233: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882402.95258: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11661 1726882402.95262: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882402.95266: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 
1726882402.95339: Set connection var ansible_connection to ssh 11661 1726882402.95344: Set connection var ansible_pipelining to False 11661 1726882402.95349: Set connection var ansible_module_compression to ZIP_DEFLATED 11661 1726882402.95359: Set connection var ansible_timeout to 10 11661 1726882402.95362: Set connection var ansible_shell_type to sh 11661 1726882402.95370: Set connection var ansible_shell_executable to /bin/sh 11661 1726882402.95387: variable 'ansible_shell_executable' from source: unknown 11661 1726882402.95390: variable 'ansible_connection' from source: unknown 11661 1726882402.95392: variable 'ansible_module_compression' from source: unknown 11661 1726882402.95395: variable 'ansible_shell_type' from source: unknown 11661 1726882402.95397: variable 'ansible_shell_executable' from source: unknown 11661 1726882402.95399: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882402.95401: variable 'ansible_pipelining' from source: unknown 11661 1726882402.95405: variable 'ansible_timeout' from source: unknown 11661 1726882402.95409: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882402.95515: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11661 1726882402.95524: variable 'omit' from source: magic vars 11661 1726882402.95528: starting attempt loop 11661 1726882402.95531: running the handler 11661 1726882402.95542: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11661 1726882402.95568: 
_low_level_execute_command(): starting 11661 1726882402.95574: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11661 1726882402.96106: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882402.96131: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 11661 1726882402.96147: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882402.96234: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882402.96332: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882402.97958: stdout chunk (state=3): >>>/root <<< 11661 1726882402.98059: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882402.98123: stderr chunk (state=3): >>><<< 11661 1726882402.98125: stdout chunk (state=3): >>><<< 11661 1726882402.98158: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882402.98175: _low_level_execute_command(): starting 11661 1726882402.98182: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882402.9816-13089-79670729437398 `" && echo ansible-tmp-1726882402.9816-13089-79670729437398="` echo /root/.ansible/tmp/ansible-tmp-1726882402.9816-13089-79670729437398 `" ) && sleep 0' 11661 1726882402.98647: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882402.98654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882402.98691: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882402.98703: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 11661 1726882402.98706: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882402.98708: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882402.98768: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882402.98771: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882402.98773: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882402.98869: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882403.00736: stdout chunk (state=3): >>>ansible-tmp-1726882402.9816-13089-79670729437398=/root/.ansible/tmp/ansible-tmp-1726882402.9816-13089-79670729437398 <<< 11661 1726882403.00847: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882403.00937: stderr chunk (state=3): >>><<< 11661 1726882403.00947: stdout chunk (state=3): >>><<< 11661 1726882403.01298: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882402.9816-13089-79670729437398=/root/.ansible/tmp/ansible-tmp-1726882402.9816-13089-79670729437398 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 
10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882403.01302: variable 'ansible_module_compression' from source: unknown 11661 1726882403.01308: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11661k_nrvl51/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11661 1726882403.01312: variable 'ansible_facts' from source: unknown 11661 1726882403.01314: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882402.9816-13089-79670729437398/AnsiballZ_command.py 11661 1726882403.01386: Sending initial data 11661 1726882403.01388: Sent initial data (152 bytes) 11661 1726882403.02904: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882403.02921: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882403.02935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882403.02951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882403.02995: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882403.03009: 
stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882403.03022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882403.03039: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882403.03050: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882403.03061: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882403.03082: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882403.03095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882403.03111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882403.03122: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882403.03133: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882403.03145: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882403.03857: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882403.03882: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882403.03898: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882403.04022: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882403.05925: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" 
revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11661 1726882403.06032: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 11661 1726882403.06137: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11661k_nrvl51/tmpemd7vnl7 /root/.ansible/tmp/ansible-tmp-1726882402.9816-13089-79670729437398/AnsiballZ_command.py <<< 11661 1726882403.06233: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11661 1726882403.07702: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882403.07916: stderr chunk (state=3): >>><<< 11661 1726882403.07920: stdout chunk (state=3): >>><<< 11661 1726882403.07922: done transferring module to remote 11661 1726882403.07924: _low_level_execute_command(): starting 11661 1726882403.07927: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882402.9816-13089-79670729437398/ /root/.ansible/tmp/ansible-tmp-1726882402.9816-13089-79670729437398/AnsiballZ_command.py && sleep 0' 11661 1726882403.08920: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882403.08924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882403.08956: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882403.08960: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882403.08970: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882403.09032: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882403.09035: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882403.09037: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882403.09221: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882403.10914: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882403.11034: stderr chunk (state=3): >>><<< 11661 1726882403.11037: stdout chunk (state=3): >>><<< 11661 1726882403.11172: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 
10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882403.11180: _low_level_execute_command(): starting 11661 1726882403.11183: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882402.9816-13089-79670729437398/AnsiballZ_command.py && sleep 0' 11661 1726882403.12162: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882403.12186: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882403.12207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882403.12248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882403.12392: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882403.12422: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882403.12445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882403.12482: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882403.12513: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882403.12526: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882403.12559: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882403.12597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882403.12639: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882403.12660: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882403.12685: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882403.12715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882403.12893: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882403.12924: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882403.12938: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882403.13090: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882403.28329: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-20 21:33:23.259614", "end": "2024-09-20 21:33:23.280257", "delta": "0:00:00.020643", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp 
--dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11661 1726882403.29486: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 11661 1726882403.29619: stderr chunk (state=3): >>><<< 11661 1726882403.29622: stdout chunk (state=3): >>><<< 11661 1726882403.29671: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-20 21:33:23.259614", "end": "2024-09-20 21:33:23.280257", "delta": "0:00:00.020643", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active 
firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
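The "Stop dnsmasq/radvd services" script above tears the DHCP server down through its pidfile: `pkill -F /run/dhcp_testbr.pid` kills exactly the PID recorded in the file, then the stale pidfile and lease file are removed. A portable sketch of that pattern, using a throwaway `sleep` process and `kill "$(cat …)"` as the equivalent of `pkill -F` (paths and the background command are placeholders, not the task's real daemon):

```shell
# Sketch of pidfile-based teardown: start a background process,
# record its PID in a file, kill via the recorded PID (what
# 'pkill -F <pidfile>' does), then remove the stale pidfile.
pidfile="$(mktemp)"
sleep 60 &                      # stand-in for the dnsmasq daemon
pid=$!
echo "$pid" > "$pidfile"
kill "$(cat "$pidfile")"        # equivalent of: pkill -F "$pidfile"
wait "$pid" 2>/dev/null || true # reap so the PID is fully gone
rm -f "$pidfile"                # matches: rm -rf /run/dhcp_testbr.pid
```

Killing by the recorded PID rather than by process name avoids taking down unrelated `dnsmasq` instances; the trade-off is that a stale pidfile makes `pkill -F` a no-op, which the task tolerates because it runs with `set -uxo pipefail` but no `-e`.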
11661 1726882403.29797: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep \'release 6\' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service="$service"; then\n firewall-cmd --remove-service "$service"\n fi\n done\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882402.9816-13089-79670729437398/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11661 1726882403.29801: _low_level_execute_command(): starting 11661 1726882403.29803: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882402.9816-13089-79670729437398/ > /dev/null 2>&1 && sleep 0' 11661 1726882403.30512: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882403.30541: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882403.30563: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882403.30590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882403.30673: 
stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882403.30703: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882403.30717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882403.30733: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882403.30744: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882403.30758: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882403.30800: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882403.30813: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882403.30827: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882403.30838: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882403.30848: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882403.30863: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882403.30973: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882403.30995: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882403.31080: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882403.31266: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882403.33089: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882403.33283: stderr chunk (state=3): >>><<< 11661 1726882403.33287: stdout chunk (state=3): >>><<< 11661 1726882403.33335: _low_level_execute_command() done: 
rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882403.33339: handler run complete 11661 1726882403.33415: Evaluated conditional (False): False 11661 1726882403.33418: attempt loop complete, returning result 11661 1726882403.33420: _execute() done 11661 1726882403.33423: dumping result to json 11661 1726882403.33425: done dumping result, returning 11661 1726882403.33628: done running TaskExecutor() for managed_node2/TASK: Stop dnsmasq/radvd services [0e448fcc-3ce9-896b-2321-0000000000c6] 11661 1726882403.33631: sending task result for task 0e448fcc-3ce9-896b-2321-0000000000c6 11661 1726882403.33719: done sending task result for task 0e448fcc-3ce9-896b-2321-0000000000c6 11661 1726882403.33723: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf 
/run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "delta": "0:00:00.020643", "end": "2024-09-20 21:33:23.280257", "rc": 0, "start": "2024-09-20 21:33:23.259614" } STDERR: + exec + pkill -F /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.lease + grep 'release 6' /etc/redhat-release + systemctl is-active firewalld inactive 11661 1726882403.33835: no more pending results, returning what we have 11661 1726882403.33839: results queue empty 11661 1726882403.33840: checking for any_errors_fatal 11661 1726882403.33856: done checking for any_errors_fatal 11661 1726882403.33857: checking for max_fail_percentage 11661 1726882403.33859: done checking for max_fail_percentage 11661 1726882403.33861: checking to see if all hosts have failed and the running result is not ok 11661 1726882403.33862: done checking to see if all hosts have failed 11661 1726882403.33863: getting the remaining hosts for this loop 11661 1726882403.33899: done getting the remaining hosts for this loop 11661 1726882403.33912: getting the next task for host managed_node2 11661 1726882403.33932: done getting next task for host managed_node2 11661 1726882403.33936: ^ task is: TASK: Restore the /etc/resolv.conf for initscript 11661 1726882403.33939: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11661 1726882403.33949: getting variables 11661 1726882403.33956: in VariableManager get_vars() 11661 1726882403.34264: Calling all_inventory to load vars for managed_node2 11661 1726882403.34269: Calling groups_inventory to load vars for managed_node2 11661 1726882403.34272: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882403.34284: Calling all_plugins_play to load vars for managed_node2 11661 1726882403.34321: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882403.34328: Calling groups_plugins_play to load vars for managed_node2 11661 1726882403.36450: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882403.38305: done with get_vars() 11661 1726882403.38333: done getting variables 11661 1726882403.38403: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Restore the /etc/resolv.conf for initscript] ***************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:120 Friday 20 September 2024 21:33:23 -0400 (0:00:00.443) 0:00:32.097 ****** 11661 1726882403.38435: entering _queue_task() for managed_node2/command 11661 1726882403.38781: worker is 1 (out of 1 available) 11661 1726882403.38795: exiting _queue_task() for managed_node2/command 11661 1726882403.38809: done queuing things up, 
now waiting for results queue to drain 11661 1726882403.38811: waiting for pending results... 11661 1726882403.39166: running TaskExecutor() for managed_node2/TASK: Restore the /etc/resolv.conf for initscript 11661 1726882403.39304: in run() - task 0e448fcc-3ce9-896b-2321-0000000000c7 11661 1726882403.39331: variable 'ansible_search_path' from source: unknown 11661 1726882403.39385: calling self._execute() 11661 1726882403.39510: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882403.39522: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882403.39537: variable 'omit' from source: magic vars 11661 1726882403.39985: variable 'ansible_distribution_major_version' from source: facts 11661 1726882403.40003: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882403.40165: variable 'network_provider' from source: set_fact 11661 1726882403.40183: Evaluated conditional (network_provider == "initscripts"): False 11661 1726882403.40197: when evaluation is False, skipping this task 11661 1726882403.40211: _execute() done 11661 1726882403.40219: dumping result to json 11661 1726882403.40244: done dumping result, returning 11661 1726882403.40259: done running TaskExecutor() for managed_node2/TASK: Restore the /etc/resolv.conf for initscript [0e448fcc-3ce9-896b-2321-0000000000c7] 11661 1726882403.40274: sending task result for task 0e448fcc-3ce9-896b-2321-0000000000c7 skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 11661 1726882403.40457: no more pending results, returning what we have 11661 1726882403.40461: results queue empty 11661 1726882403.40462: checking for any_errors_fatal 11661 1726882403.40479: done checking for any_errors_fatal 11661 1726882403.40481: checking for max_fail_percentage 11661 1726882403.40482: done checking for max_fail_percentage 11661 
1726882403.40484: checking to see if all hosts have failed and the running result is not ok 11661 1726882403.40485: done checking to see if all hosts have failed 11661 1726882403.40485: getting the remaining hosts for this loop 11661 1726882403.40487: done getting the remaining hosts for this loop 11661 1726882403.40491: getting the next task for host managed_node2 11661 1726882403.40501: done getting next task for host managed_node2 11661 1726882403.40504: ^ task is: TASK: Verify network state restored to default 11661 1726882403.40508: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11661 1726882403.40514: getting variables 11661 1726882403.40516: in VariableManager get_vars() 11661 1726882403.40562: Calling all_inventory to load vars for managed_node2 11661 1726882403.40567: Calling groups_inventory to load vars for managed_node2 11661 1726882403.40571: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882403.40589: Calling all_plugins_play to load vars for managed_node2 11661 1726882403.40593: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882403.40597: Calling groups_plugins_play to load vars for managed_node2 11661 1726882403.41722: done sending task result for task 0e448fcc-3ce9-896b-2321-0000000000c7 11661 1726882403.41730: WORKER PROCESS EXITING 11661 1726882403.42830: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882403.44788: done with get_vars() 11661 1726882403.44819: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:125 Friday 20 September 2024 21:33:23 -0400 (0:00:00.064) 0:00:32.162 ****** 11661 1726882403.44921: entering _queue_task() for managed_node2/include_tasks 11661 1726882403.45259: worker is 1 (out of 1 available) 11661 1726882403.45276: exiting _queue_task() for managed_node2/include_tasks 11661 1726882403.45289: done queuing things up, now waiting for results queue to drain 11661 1726882403.45291: waiting for pending results... 
11661 1726882403.45675: running TaskExecutor() for managed_node2/TASK: Verify network state restored to default 11661 1726882403.45820: in run() - task 0e448fcc-3ce9-896b-2321-0000000000c8 11661 1726882403.45846: variable 'ansible_search_path' from source: unknown 11661 1726882403.45897: calling self._execute() 11661 1726882403.46012: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882403.46027: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882403.46043: variable 'omit' from source: magic vars 11661 1726882403.46527: variable 'ansible_distribution_major_version' from source: facts 11661 1726882403.46562: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882403.46589: _execute() done 11661 1726882403.46598: dumping result to json 11661 1726882403.46608: done dumping result, returning 11661 1726882403.46620: done running TaskExecutor() for managed_node2/TASK: Verify network state restored to default [0e448fcc-3ce9-896b-2321-0000000000c8] 11661 1726882403.46630: sending task result for task 0e448fcc-3ce9-896b-2321-0000000000c8 11661 1726882403.46810: no more pending results, returning what we have 11661 1726882403.46816: in VariableManager get_vars() 11661 1726882403.46873: Calling all_inventory to load vars for managed_node2 11661 1726882403.46877: Calling groups_inventory to load vars for managed_node2 11661 1726882403.46879: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882403.46894: Calling all_plugins_play to load vars for managed_node2 11661 1726882403.46898: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882403.46903: Calling groups_plugins_play to load vars for managed_node2 11661 1726882403.48116: done sending task result for task 0e448fcc-3ce9-896b-2321-0000000000c8 11661 1726882403.48121: WORKER PROCESS EXITING 11661 1726882403.48892: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882403.50728: done with get_vars() 11661 1726882403.50765: variable 'ansible_search_path' from source: unknown 11661 1726882403.50784: we have included files to process 11661 1726882403.50785: generating all_blocks data 11661 1726882403.50788: done generating all_blocks data 11661 1726882403.50794: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 11661 1726882403.50795: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 11661 1726882403.50797: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 11661 1726882403.51213: done processing included file 11661 1726882403.51215: iterating over new_blocks loaded from include file 11661 1726882403.51217: in VariableManager get_vars() 11661 1726882403.51235: done with get_vars() 11661 1726882403.51236: filtering new block on tags 11661 1726882403.51272: done filtering new block on tags 11661 1726882403.51274: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node2 11661 1726882403.51279: extending task lists for all hosts with included blocks 11661 1726882403.52789: done extending task lists 11661 1726882403.52791: done processing included files 11661 1726882403.52792: results queue empty 11661 1726882403.52793: checking for any_errors_fatal 11661 1726882403.52796: done checking for any_errors_fatal 11661 1726882403.52796: checking for max_fail_percentage 11661 1726882403.52798: done checking for max_fail_percentage 11661 1726882403.52798: checking to see if all hosts have failed and the running 
result is not ok 11661 1726882403.52799: done checking to see if all hosts have failed 11661 1726882403.52800: getting the remaining hosts for this loop 11661 1726882403.52801: done getting the remaining hosts for this loop 11661 1726882403.52804: getting the next task for host managed_node2 11661 1726882403.52808: done getting next task for host managed_node2 11661 1726882403.52811: ^ task is: TASK: Check routes and DNS 11661 1726882403.52813: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11661 1726882403.52816: getting variables 11661 1726882403.52818: in VariableManager get_vars() 11661 1726882403.52841: Calling all_inventory to load vars for managed_node2 11661 1726882403.52844: Calling groups_inventory to load vars for managed_node2 11661 1726882403.52846: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882403.52856: Calling all_plugins_play to load vars for managed_node2 11661 1726882403.52859: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882403.52862: Calling groups_plugins_play to load vars for managed_node2 11661 1726882403.54331: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882403.56322: done with get_vars() 11661 1726882403.56356: done getting variables 11661 1726882403.56407: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Friday 20 September 2024 21:33:23 -0400 (0:00:00.115) 0:00:32.277 ****** 11661 1726882403.56445: entering _queue_task() for managed_node2/shell 11661 1726882403.56806: worker is 1 (out of 1 available) 11661 1726882403.56819: exiting _queue_task() for managed_node2/shell 11661 1726882403.56831: done queuing things up, now waiting for results queue to drain 11661 1726882403.56832: waiting for pending results... 
11661 1726882403.57267: running TaskExecutor() for managed_node2/TASK: Check routes and DNS 11661 1726882403.57357: in run() - task 0e448fcc-3ce9-896b-2321-00000000056d 11661 1726882403.57379: variable 'ansible_search_path' from source: unknown 11661 1726882403.57383: variable 'ansible_search_path' from source: unknown 11661 1726882403.57411: calling self._execute() 11661 1726882403.57488: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882403.57494: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882403.57503: variable 'omit' from source: magic vars 11661 1726882403.57785: variable 'ansible_distribution_major_version' from source: facts 11661 1726882403.57794: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882403.57800: variable 'omit' from source: magic vars 11661 1726882403.57838: variable 'omit' from source: magic vars 11661 1726882403.57865: variable 'omit' from source: magic vars 11661 1726882403.57899: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11661 1726882403.57927: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11661 1726882403.57945: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11661 1726882403.57964: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882403.57975: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882403.57997: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11661 1726882403.58000: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882403.58004: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882403.58077: 
Set connection var ansible_connection to ssh 11661 1726882403.58081: Set connection var ansible_pipelining to False 11661 1726882403.58087: Set connection var ansible_module_compression to ZIP_DEFLATED 11661 1726882403.58094: Set connection var ansible_timeout to 10 11661 1726882403.58096: Set connection var ansible_shell_type to sh 11661 1726882403.58102: Set connection var ansible_shell_executable to /bin/sh 11661 1726882403.58119: variable 'ansible_shell_executable' from source: unknown 11661 1726882403.58121: variable 'ansible_connection' from source: unknown 11661 1726882403.58125: variable 'ansible_module_compression' from source: unknown 11661 1726882403.58127: variable 'ansible_shell_type' from source: unknown 11661 1726882403.58129: variable 'ansible_shell_executable' from source: unknown 11661 1726882403.58131: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882403.58135: variable 'ansible_pipelining' from source: unknown 11661 1726882403.58138: variable 'ansible_timeout' from source: unknown 11661 1726882403.58142: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882403.58242: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11661 1726882403.58251: variable 'omit' from source: magic vars 11661 1726882403.58261: starting attempt loop 11661 1726882403.58265: running the handler 11661 1726882403.58275: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11661 1726882403.58290: 
_low_level_execute_command(): starting 11661 1726882403.58298: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11661 1726882403.58826: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882403.58836: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882403.58857: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 11661 1726882403.58873: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882403.58884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882403.58927: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882403.58942: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882403.58952: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882403.59077: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882403.60737: stdout chunk (state=3): >>>/root <<< 11661 1726882403.60928: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882403.60932: stdout chunk (state=3): >>><<< 11661 1726882403.60934: stderr chunk (state=3): >>><<< 11661 
1726882403.60971: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882403.61070: _low_level_execute_command(): starting 11661 1726882403.61074: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882403.609611-13114-264646612963050 `" && echo ansible-tmp-1726882403.609611-13114-264646612963050="` echo /root/.ansible/tmp/ansible-tmp-1726882403.609611-13114-264646612963050 `" ) && sleep 0' 11661 1726882403.61719: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882403.61722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 
1726882403.61757: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882403.61760: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882403.61762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882403.61834: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882403.61855: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882403.62104: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882403.63931: stdout chunk (state=3): >>>ansible-tmp-1726882403.609611-13114-264646612963050=/root/.ansible/tmp/ansible-tmp-1726882403.609611-13114-264646612963050 <<< 11661 1726882403.63996: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882403.64053: stderr chunk (state=3): >>><<< 11661 1726882403.64056: stdout chunk (state=3): >>><<< 11661 1726882403.64180: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882403.609611-13114-264646612963050=/root/.ansible/tmp/ansible-tmp-1726882403.609611-13114-264646612963050 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882403.64194: variable 'ansible_module_compression' from source: unknown 11661 1726882403.64197: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11661k_nrvl51/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11661 1726882403.64199: variable 'ansible_facts' from source: unknown 11661 1726882403.64335: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882403.609611-13114-264646612963050/AnsiballZ_command.py 11661 1726882403.64731: Sending initial data 11661 1726882403.64780: Sent initial data (155 bytes) 11661 1726882403.66800: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882403.66820: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882403.66835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882403.66852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882403.66916: stderr chunk (state=3): 
>>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882403.66931: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882403.66944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882403.66961: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882403.66974: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882403.66983: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882403.67080: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882403.67186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882403.67203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882403.67216: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882403.67228: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882403.67242: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882403.67325: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882403.67346: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882403.67361: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882403.67494: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882403.69257: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: 
Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11661 1726882403.69341: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 11661 1726882403.69437: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11661k_nrvl51/tmpq7mlf0re /root/.ansible/tmp/ansible-tmp-1726882403.609611-13114-264646612963050/AnsiballZ_command.py <<< 11661 1726882403.69527: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11661 1726882403.70989: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882403.71091: stderr chunk (state=3): >>><<< 11661 1726882403.71095: stdout chunk (state=3): >>><<< 11661 1726882403.71118: done transferring module to remote 11661 1726882403.71129: _low_level_execute_command(): starting 11661 1726882403.71134: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882403.609611-13114-264646612963050/ /root/.ansible/tmp/ansible-tmp-1726882403.609611-13114-264646612963050/AnsiballZ_command.py && sleep 0' 11661 1726882403.71928: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882403.71937: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882403.71947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882403.71966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882403.72016: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 
10.31.11.158 <<< 11661 1726882403.72022: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882403.72031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882403.72044: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882403.72051: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882403.72060: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882403.72069: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882403.72081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882403.72099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882403.72111: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882403.72118: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882403.72132: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882403.72203: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882403.72224: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882403.72230: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882403.72365: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882403.74131: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882403.74234: stderr chunk (state=3): >>><<< 11661 1726882403.74238: stdout chunk (state=3): >>><<< 11661 1726882403.74268: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882403.74272: _low_level_execute_command(): starting 11661 1726882403.74275: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882403.609611-13114-264646612963050/AnsiballZ_command.py && sleep 0' 11661 1726882403.75430: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882403.75434: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882403.75435: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882403.75437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882403.75439: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882403.75441: stderr chunk (state=3): >>>debug2: match not found <<< 11661 
1726882403.75442: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882403.75444: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882403.75445: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882403.75447: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882403.75448: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882403.75450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882403.75452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882403.75453: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882403.75455: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882403.75462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882403.75466: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882403.75468: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882403.75470: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882403.75472: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882403.89396: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:4f:68:7a:de:b1 brd 
ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.11.158/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 3460sec preferred_lft 3460sec\n inet6 fe80::104f:68ff:fe7a:deb1/64 scope link \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.11.158 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.11.158 metric 100 \nIP -6 ROUTE\n::1 dev lo proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:33:23.882846", "end": "2024-09-20 21:33:23.891475", "delta": "0:00:00.008629", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11661 1726882403.90556: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 11661 1726882403.90630: stderr chunk (state=3): >>><<< 11661 1726882403.90634: stdout chunk (state=3): >>><<< 11661 1726882403.90786: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:4f:68:7a:de:b1 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.11.158/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 3460sec preferred_lft 3460sec\n inet6 fe80::104f:68ff:fe7a:deb1/64 scope link \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.11.158 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.11.158 metric 100 \nIP -6 ROUTE\n::1 dev lo proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:33:23.882846", "end": "2024-09-20 21:33:23.891475", "delta": "0:00:00.008629", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, 
"strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
11661 1726882403.90796: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882403.609611-13114-264646612963050/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11661 1726882403.90799: _low_level_execute_command(): starting 11661 1726882403.90802: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882403.609611-13114-264646612963050/ > /dev/null 2>&1 && sleep 0' 11661 1726882403.91376: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882403.91391: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882403.91407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882403.91426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882403.91472: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882403.91485: stderr chunk (state=3): >>>debug2: match not found <<< 11661 1726882403.91499: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882403.91516: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11661 1726882403.91528: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 11661 1726882403.91539: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11661 1726882403.91552: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882403.91569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882403.91585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882403.91598: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 11661 1726882403.91609: stderr chunk (state=3): >>>debug2: match found <<< 11661 1726882403.91622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882403.91696: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882403.91719: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882403.91736: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882403.91868: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882403.93657: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882403.93730: stderr chunk (state=3): >>><<< 11661 1726882403.93734: stdout chunk (state=3): >>><<< 11661 1726882403.93844: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882403.93847: handler run complete 11661 1726882403.93849: Evaluated conditional (False): False 11661 1726882403.93853: attempt loop complete, returning result 11661 1726882403.93856: _execute() done 11661 1726882403.93858: dumping result to json 11661 1726882403.93861: done dumping result, returning 11661 1726882403.93865: done running TaskExecutor() for managed_node2/TASK: Check routes and DNS [0e448fcc-3ce9-896b-2321-00000000056d] 11661 1726882403.93867: sending task result for task 0e448fcc-3ce9-896b-2321-00000000056d 11661 1726882403.93940: done sending task result for task 0e448fcc-3ce9-896b-2321-00000000056d 11661 1726882403.93942: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008629", "end": "2024-09-20 21:33:23.891475", "rc": 0, "start": "2024-09-20 21:33:23.882846" } STDOUT: IP 1: 
lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 12:4f:68:7a:de:b1 brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.11.158/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0 valid_lft 3460sec preferred_lft 3460sec inet6 fe80::104f:68ff:fe7a:deb1/64 scope link valid_lft forever preferred_lft forever IP ROUTE default via 10.31.8.1 dev eth0 proto dhcp src 10.31.11.158 metric 100 10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.11.158 metric 100 IP -6 ROUTE ::1 dev lo proto kernel metric 256 pref medium fe80::/64 dev eth0 proto kernel metric 256 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 11661 1726882403.94233: no more pending results, returning what we have 11661 1726882403.94236: results queue empty 11661 1726882403.94237: checking for any_errors_fatal 11661 1726882403.94239: done checking for any_errors_fatal 11661 1726882403.94239: checking for max_fail_percentage 11661 1726882403.94241: done checking for max_fail_percentage 11661 1726882403.94242: checking to see if all hosts have failed and the running result is not ok 11661 1726882403.94243: done checking to see if all hosts have failed 11661 1726882403.94244: getting the remaining hosts for this loop 11661 1726882403.94245: done getting the remaining hosts for this loop 11661 1726882403.94249: getting the next task for host managed_node2 11661 1726882403.94255: done getting next task for host managed_node2 11661 1726882403.94257: ^ task is: TASK: Verify DNS and network connectivity 11661 1726882403.94262: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11661 1726882403.94273: getting variables 11661 1726882403.94274: in VariableManager get_vars() 11661 1726882403.94311: Calling all_inventory to load vars for managed_node2 11661 1726882403.94313: Calling groups_inventory to load vars for managed_node2 11661 1726882403.94315: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882403.94325: Calling all_plugins_play to load vars for managed_node2 11661 1726882403.94328: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882403.94331: Calling groups_plugins_play to load vars for managed_node2 11661 1726882403.95399: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882403.96411: done with get_vars() 11661 1726882403.96427: done getting variables 11661 1726882403.96474: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Friday 20 September 2024 21:33:23 -0400 (0:00:00.400) 0:00:32.678 ****** 11661 1726882403.96499: entering _queue_task() for managed_node2/shell 11661 1726882403.96728: worker is 1 (out of 1 available) 11661 1726882403.96742: exiting _queue_task() for managed_node2/shell 11661 1726882403.96754: done queuing things up, now waiting for results queue to drain 11661 1726882403.96756: waiting for pending results... 11661 1726882403.96951: running TaskExecutor() for managed_node2/TASK: Verify DNS and network connectivity 11661 1726882403.97086: in run() - task 0e448fcc-3ce9-896b-2321-00000000056e 11661 1726882403.97113: variable 'ansible_search_path' from source: unknown 11661 1726882403.97124: variable 'ansible_search_path' from source: unknown 11661 1726882403.97174: calling self._execute() 11661 1726882403.97296: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882403.97315: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882403.97338: variable 'omit' from source: magic vars 11661 1726882403.97801: variable 'ansible_distribution_major_version' from source: facts 11661 1726882403.97829: Evaluated conditional (ansible_distribution_major_version != '6'): True 11661 1726882403.97995: variable 'ansible_facts' from source: unknown 11661 1726882403.98642: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 11661 1726882403.98664: variable 'omit' from source: magic vars 11661 1726882403.98720: variable 'omit' from source: magic vars 11661 1726882403.98748: variable 'omit' from source: magic vars 11661 1726882403.98797: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11661 1726882403.98862: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11661 
1726882403.98905: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11661 1726882403.98936: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882403.98966: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11661 1726882403.99013: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11661 1726882403.99020: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882403.99023: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882403.99142: Set connection var ansible_connection to ssh 11661 1726882403.99168: Set connection var ansible_pipelining to False 11661 1726882403.99187: Set connection var ansible_module_compression to ZIP_DEFLATED 11661 1726882403.99206: Set connection var ansible_timeout to 10 11661 1726882403.99215: Set connection var ansible_shell_type to sh 11661 1726882403.99229: Set connection var ansible_shell_executable to /bin/sh 11661 1726882403.99279: variable 'ansible_shell_executable' from source: unknown 11661 1726882403.99294: variable 'ansible_connection' from source: unknown 11661 1726882403.99308: variable 'ansible_module_compression' from source: unknown 11661 1726882403.99316: variable 'ansible_shell_type' from source: unknown 11661 1726882403.99333: variable 'ansible_shell_executable' from source: unknown 11661 1726882403.99348: variable 'ansible_host' from source: host vars for 'managed_node2' 11661 1726882403.99361: variable 'ansible_pipelining' from source: unknown 11661 1726882403.99375: variable 'ansible_timeout' from source: unknown 11661 1726882403.99395: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11661 1726882403.99577: Loading ActionModule 'shell' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11661 1726882403.99608: variable 'omit' from source: magic vars 11661 1726882403.99616: starting attempt loop 11661 1726882403.99621: running the handler 11661 1726882403.99631: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11661 1726882403.99649: _low_level_execute_command(): starting 11661 1726882403.99658: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11661 1726882404.00594: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11661 1726882404.00601: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882404.00622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882404.00643: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882404.00683: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882404.00705: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882404.00708: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882404.00757: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882404.00771: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882404.00894: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882404.02519: stdout chunk (state=3): >>>/root <<< 11661 1726882404.02616: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882404.02680: stderr chunk (state=3): >>><<< 11661 1726882404.02687: stdout chunk (state=3): >>><<< 11661 1726882404.02706: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 debug2: Received exit status from master 0 11661 1726882404.02718: _low_level_execute_command(): starting 11661 1726882404.02725: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882404.0270607-13143-239969239573914 `" && echo ansible-tmp-1726882404.0270607-13143-239969239573914="` echo /root/.ansible/tmp/ansible-tmp-1726882404.0270607-13143-239969239573914 `" ) && sleep 0' 11661 1726882404.03222: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882404.03234: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882404.03255: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 11661 1726882404.03271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882404.03283: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882404.03328: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882404.03340: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882404.03444: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 11661 1726882404.05330: stdout chunk (state=3): >>>ansible-tmp-1726882404.0270607-13143-239969239573914=/root/.ansible/tmp/ansible-tmp-1726882404.0270607-13143-239969239573914 <<< 11661 1726882404.05464: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882404.05594: stderr chunk (state=3): >>><<< 11661 1726882404.05608: stdout chunk (state=3): >>><<< 11661 1726882404.05634: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882404.0270607-13143-239969239573914=/root/.ansible/tmp/ansible-tmp-1726882404.0270607-13143-239969239573914 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882404.05666: variable 'ansible_module_compression' from source: unknown 11661 1726882404.05716: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-11661k_nrvl51/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11661 1726882404.05738: variable 'ansible_facts' from source: unknown 11661 1726882404.05835: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882404.0270607-13143-239969239573914/AnsiballZ_command.py 11661 1726882404.05983: Sending initial data 11661 1726882404.05992: Sent initial data (156 bytes) 11661 1726882404.06765: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882404.06769: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882404.06805: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 11661 1726882404.06808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882404.06811: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882404.06860: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882404.06874: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882404.06986: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882404.08736: stderr chunk (state=3): 
>>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11661 1726882404.08830: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 11661 1726882404.08922: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11661k_nrvl51/tmpi501kmmr /root/.ansible/tmp/ansible-tmp-1726882404.0270607-13143-239969239573914/AnsiballZ_command.py <<< 11661 1726882404.09031: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11661 1726882404.10249: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882404.10433: stderr chunk (state=3): >>><<< 11661 1726882404.10436: stdout chunk (state=3): >>><<< 11661 1726882404.10455: done transferring module to remote 11661 1726882404.10466: _low_level_execute_command(): starting 11661 1726882404.10469: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882404.0270607-13143-239969239573914/ /root/.ansible/tmp/ansible-tmp-1726882404.0270607-13143-239969239573914/AnsiballZ_command.py && sleep 0' 11661 1726882404.11157: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882404.11165: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882404.11211: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 11661 1726882404.11230: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882404.11237: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882404.11253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 11661 1726882404.11261: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882404.11320: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882404.11325: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11661 1726882404.11344: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882404.11449: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882404.13223: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882404.13278: stderr chunk (state=3): >>><<< 11661 1726882404.13282: stdout chunk (state=3): >>><<< 11661 1726882404.13297: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882404.13300: _low_level_execute_command(): starting 11661 1726882404.13303: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882404.0270607-13143-239969239573914/AnsiballZ_command.py && sleep 0' 11661 1726882404.13836: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882404.13862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11661 1726882404.13906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882404.13978: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882404.13984: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 11661 1726882404.14006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 11661 1726882404.14014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882404.14102: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882404.14123: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882404.14254: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882404.46596: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 
wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 2089 0 --:--:-- --:--:-- --:--:-- 2089\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 13227 0 --:--:-- --:--:-- --:--:-- 13227", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 21:33:24.271532", "end": "2024-09-20 21:33:24.462900", "delta": "0:00:00.191368", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11661 1726882404.47931: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 11661 1726882404.47935: stdout chunk (state=3): >>><<< 11661 1726882404.47937: stderr chunk (state=3): >>><<< 11661 1726882404.48095: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 2089 0 --:--:-- --:--:-- --:--:-- 2089\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 13227 0 --:--:-- --:--:-- --:--:-- 13227", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org 
mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 21:33:24.271532", "end": "2024-09-20 21:33:24.462900", "delta": "0:00:00.191368", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 11661 1726882404.48104: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882404.0270607-13143-239969239573914/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11661 1726882404.48107: _low_level_execute_command(): starting 11661 1726882404.48110: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882404.0270607-13143-239969239573914/ > /dev/null 2>&1 && sleep 0' 11661 1726882404.48670: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882404.48678: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11661 1726882404.48708: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 11661 1726882404.48711: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11661 1726882404.48714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 11661 1726882404.48716: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11661 1726882404.48770: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11661 1726882404.48778: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11661 1726882404.48879: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11661 1726882404.50716: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11661 1726882404.50823: stderr chunk (state=3): >>><<< 11661 1726882404.50833: stdout chunk (state=3): >>><<< 11661 1726882404.50872: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11661 1726882404.51177: handler run complete 11661 1726882404.51180: Evaluated conditional (False): False 11661 1726882404.51182: attempt loop complete, returning result 11661 1726882404.51188: _execute() done 11661 1726882404.51190: dumping result to json 11661 1726882404.51192: done dumping result, returning 11661 1726882404.51194: done running TaskExecutor() for managed_node2/TASK: Verify DNS and network connectivity [0e448fcc-3ce9-896b-2321-00000000056e] 11661 1726882404.51196: sending task result for task 0e448fcc-3ce9-896b-2321-00000000056e 11661 1726882404.51269: done sending task result for task 0e448fcc-3ce9-896b-2321-00000000056e 11661 1726882404.51277: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "delta": "0:00:00.191368", "end": "2024-09-20 21:33:24.462900", "rc": 0, "start": "2024-09-20 21:33:24.271532" } STDOUT: CHECK DNS AND CONNECTIVITY 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org STDERR: % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 305 100 305 0 0 2089 0 --:--:-- --:--:-- --:--:-- 2089 % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 291 100 291 0 0 13227 0 --:--:-- --:--:-- --:--:-- 13227 11661 1726882404.51342: no more pending results, returning what we have 11661 1726882404.51345: results queue empty 11661 1726882404.51346: 
checking for any_errors_fatal 11661 1726882404.51358: done checking for any_errors_fatal 11661 1726882404.51359: checking for max_fail_percentage 11661 1726882404.51360: done checking for max_fail_percentage 11661 1726882404.51361: checking to see if all hosts have failed and the running result is not ok 11661 1726882404.51362: done checking to see if all hosts have failed 11661 1726882404.51362: getting the remaining hosts for this loop 11661 1726882404.51365: done getting the remaining hosts for this loop 11661 1726882404.51371: getting the next task for host managed_node2 11661 1726882404.51377: done getting next task for host managed_node2 11661 1726882404.51379: ^ task is: TASK: meta (flush_handlers) 11661 1726882404.51380: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11661 1726882404.51383: getting variables 11661 1726882404.51384: in VariableManager get_vars() 11661 1726882404.51416: Calling all_inventory to load vars for managed_node2 11661 1726882404.51418: Calling groups_inventory to load vars for managed_node2 11661 1726882404.51419: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882404.51431: Calling all_plugins_play to load vars for managed_node2 11661 1726882404.51434: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882404.51437: Calling groups_plugins_play to load vars for managed_node2 11661 1726882404.52927: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882404.53872: done with get_vars() 11661 1726882404.53894: done getting variables 11661 1726882404.53945: in VariableManager get_vars() 11661 1726882404.53958: Calling all_inventory to load vars for managed_node2 11661 1726882404.53959: Calling groups_inventory to load vars for managed_node2 11661 1726882404.53961: Calling all_plugins_inventory to load vars for managed_node2 11661 1726882404.53966: Calling all_plugins_play to load vars for managed_node2 11661 1726882404.53968: Calling groups_plugins_inventory to load vars for managed_node2 11661 1726882404.53969: Calling groups_plugins_play to load vars for managed_node2 11661 1726882404.59480: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11661 1726882404.61195: done with get_vars() 11661 1726882404.61249: done queuing things up, now waiting for results queue to drain 11661 1726882404.61254: results queue empty 11661 1726882404.61255: checking for any_errors_fatal 11661 1726882404.61259: done checking for any_errors_fatal 11661 1726882404.61260: checking for max_fail_percentage 11661 1726882404.61261: done checking for max_fail_percentage 11661 1726882404.61262: checking to see if all hosts have failed and the running result is not 
ok
11661 1726882404.61264: done checking to see if all hosts have failed
11661 1726882404.61265: getting the remaining hosts for this loop
11661 1726882404.61266: done getting the remaining hosts for this loop
11661 1726882404.61269: getting the next task for host managed_node2
11661 1726882404.61277: done getting next task for host managed_node2
11661 1726882404.61278: ^ task is: TASK: meta (flush_handlers)
11661 1726882404.61283: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11661 1726882404.61288: getting variables
11661 1726882404.61289: in VariableManager get_vars()
11661 1726882404.61303: Calling all_inventory to load vars for managed_node2
11661 1726882404.61305: Calling groups_inventory to load vars for managed_node2
11661 1726882404.61306: Calling all_plugins_inventory to load vars for managed_node2
11661 1726882404.61311: Calling all_plugins_play to load vars for managed_node2
11661 1726882404.61312: Calling groups_plugins_inventory to load vars for managed_node2
11661 1726882404.61314: Calling groups_plugins_play to load vars for managed_node2
11661 1726882404.62197: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11661 1726882404.63501: done with get_vars()
11661 1726882404.63529: done getting variables
11661 1726882404.63591: in VariableManager get_vars()
11661 1726882404.63608: Calling all_inventory to load vars for managed_node2
11661 1726882404.63610: Calling groups_inventory to load vars for managed_node2
11661 1726882404.63612: Calling all_plugins_inventory to load vars for managed_node2
11661 1726882404.63617: Calling all_plugins_play to load vars for managed_node2
11661 1726882404.63620: Calling groups_plugins_inventory to load vars for managed_node2
11661 1726882404.63623: Calling groups_plugins_play to load vars for managed_node2
11661 1726882404.65050: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11661 1726882404.67046: done with get_vars()
11661 1726882404.67091: done queuing things up, now waiting for results queue to drain
11661 1726882404.67094: results queue empty
11661 1726882404.67095: checking for any_errors_fatal
11661 1726882404.67096: done checking for any_errors_fatal
11661 1726882404.67097: checking for max_fail_percentage
11661 1726882404.67098: done checking for max_fail_percentage
11661 1726882404.67099: checking to see if all hosts have failed and the running result is not ok
11661 1726882404.67100: done checking to see if all hosts have failed
11661 1726882404.67101: getting the remaining hosts for this loop
11661 1726882404.67102: done getting the remaining hosts for this loop
11661 1726882404.67105: getting the next task for host managed_node2
11661 1726882404.67108: done getting next task for host managed_node2
11661 1726882404.67109: ^ task is: None
11661 1726882404.67111: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11661 1726882404.67112: done queuing things up, now waiting for results queue to drain
11661 1726882404.67113: results queue empty
11661 1726882404.67114: checking for any_errors_fatal
11661 1726882404.67114: done checking for any_errors_fatal
11661 1726882404.67115: checking for max_fail_percentage
11661 1726882404.67116: done checking for max_fail_percentage
11661 1726882404.67117: checking to see if all hosts have failed and the running result is not ok
11661 1726882404.67117: done checking to see if all hosts have failed
11661 1726882404.67120: getting the next task for host managed_node2
11661 1726882404.67122: done getting next task for host managed_node2
11661 1726882404.67123: ^ task is: None
11661 1726882404.67124: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed_node2              : ok=76   changed=3    unreachable=0    failed=0    skipped=60   rescued=0    ignored=0

Friday 20 September 2024  21:33:24 -0400 (0:00:00.707)       0:00:33.385 ******
===============================================================================
Install dnsmasq --------------------------------------------------------- 2.88s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3
fedora.linux_system_roles.network : Check which services are running ---- 1.84s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.84s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Create test interfaces -------------------------------------------------- 1.71s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35
Gathering Facts --------------------------------------------------------- 1.34s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_nm.yml:6
Install pgrep, sysctl --------------------------------------------------- 1.33s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26
fedora.linux_system_roles.network : Check which packages are installed --- 1.27s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.93s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.92s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.85s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Gathering Facts --------------------------------------------------------- 0.80s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:3
fedora.linux_system_roles.network : Check which packages are installed --- 0.76s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Verify DNS and network connectivity ------------------------------------- 0.71s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24
Gather the minimum subset of ansible_facts required by the network role test --- 0.70s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
Check if system is ostree ----------------------------------------------- 0.63s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.63s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.54s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Get NM profile info ----------------------------------------------------- 0.51s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25
Remove test interfaces -------------------------------------------------- 0.48s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:3
Delete the device 'nm-bond' --------------------------------------------- 0.48s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:114
11661 1726882404.67254: RUNNING CLEANUP